diff --git a/.buildinfo b/.buildinfo new file mode 100644 index 0000000..58da66f --- /dev/null +++ b/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: ce8b6b98136a67e2e3a2318e3afdcc8d +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/.nojekyll b/.nojekyll new file mode 100644 index 0000000..e69de29 diff --git a/_generated/lasso.CubeTransit/index.html b/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..aae054d --- /dev/null +++ b/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,569 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
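A slightly fuller sketch of the same workflow. The directory paths are hypothetical placeholders; everything else follows the calls documented on this page (create_from_cube reads a directory of Cube line files, evaluate_differences returns project-card-style dictionaries).

from lasso import CubeTransit

# Hypothetical directories of Cube .lin files for the base and build scenarios.
base_transit_network = CubeTransit.create_from_cube("cube/base")
build_transit_network = CubeTransit.create_from_cube("cube/build")

# Compare the build network against the base; each entry is a project card
# change dictionary as described in the method documentation below.
transit_change_list = build_transit_network.evaluate_differences(base_transit_network)

for change in transit_change_list:
    print(change)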
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type:
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type:
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type:
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type:
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type:
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change. WARNING: Does not handle discontinuous time periods.

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

Identifies what routes need to be updated, deleted, or added, and creates project card dictionaries for each change.
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and constructs returns list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters:
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns:
+

Line name with new time period.

+
+
+
+ +
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
+ +
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters:
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns:
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
+
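A minimal usage sketch of this static helper; the argument values mirror the parameter examples above, and the printed result is expected to match the documented pattern rather than being recomputed here.

from lasso import CubeTransit

# Argument values taken from the examples in the parameter list above.
line_name = CubeTransit.build_route_name(
    route_id="452-111",
    time_period="pk",
    agency_id=0,
    direction_id=1,
)
print(line_name)  # per the docstring, e.g. "0_452-111_452_pk1"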
+ +
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. WARNING: Does not handle discontinuous time periods.

+
+
Parameters:
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters:
+

line – name of line that is being updated

+
+
Returns:
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters:
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns:
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters:
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period-specific variables like headway, and variables that have standard units, such as headway, which is stored in minutes in Cube and in seconds in the standard format.

+
+
Parameters:
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns:

A list of dictionaries with values for "property": <standard style property name>, "set": <property value with correct units>.

+
+
+
+ +
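A hedged illustration of the unit conversion described above. The Cube-style property name HEADWAY[1] is an assumption used only for illustration; the documented contract is that the method returns a list of {"property": ..., "set": ...} dictionaries with standard names and units (e.g. headway in seconds rather than minutes).

from lasso import CubeTransit

# Hypothetical Cube-style properties: a 10-minute headway for time period 1.
cube_props = {"HEADWAY[1]": 10}  # property name is an assumption for illustration

standard_props = CubeTransit.cube_properties_to_standard_properties(cube_props)

# Expected shape per the docstring:
# [{"property": <standard style property name>, "set": <value in standard units>}, ...]
print(standard_props)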
+
+evaluate_differences(base_transit)[source]
+
1. Identifies what routes need to be updated, deleted, or added.
2. For routes being added or updated, identifies if the time periods have changed or if there are multiples, and makes duplicate lines if so.
3. Creates project card dictionaries for each change.
+
+
Parameters:
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns:
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters:
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns:
+

+
a list of dictionary values suitable for writing to a project card

{ +‘property’: <property_name>, +‘set’: <set value>, +‘change’: <change from existing value>, +‘existing’: <existing value to check>, +}

+
+
+

+
+
Return type:
+

transit_change_list (list)

+
+
+
+ +
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and constructs returns list of changes +suitable for a project card.

+
+
Parameters:
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns:
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and returns the numbers in them.

+
+
Parameters:
+

properties_list (list) – list of all properties.

+
+
Returns:
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters:
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns:

route_id (str): i.e. 452-111
time_period (str): i.e. pk
direction_id (str): i.e. 1
agency_id (str): i.e. 0

+
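A quick round-trip sketch using the documented example name. The ordering of the returned values is inferred from the docstring above and should be checked against the source.

from lasso import CubeTransit

# The example line name from the docstring above.
unpacked = CubeTransit.unpack_route_name("0_452-111_452_pk1")

# Expected contents per the docstring: route_id "452-111", time period "pk",
# direction "1", agency "0" (ordering assumed, not verified).
print(unpacked)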
+
+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_generated/lasso.ModelRoadwayNetwork/index.html b/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..40f38df --- /dev/null +++ b/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1571 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters:
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is "strongly" connected A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints:
+

list of length of two unique keys of nodes making up endpoints of segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode:
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is "strongly" connected A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links:
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

+
rtype:
+

GeoDataFrame

+
+
+

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what metcouncil's model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
rtype:
+

bool

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
rtype:
+

tuple

+
+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that that property exists in the network.

validate_selection(selection[, ...])

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. +#MC +join the network with count node data, via SHST API node match result

+
+
Parameters:
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+ +

Add data from links going to/from nodes to node.

+
+
Return type:
+

DataFrame

+
+
Parameters:
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns:
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

add the new roadway features defined in the project card. +new shapes are also added for the new roadway links.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters:
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True is overwriting existing variable. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters:
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters:
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph)
List of disconnected subgraphs described by the list of their member nodes (as described by their model_node_id)

+
+
+
+
+ +
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Return type:
+

tuple

+
+
Parameters:
+

selection_dictionary – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled with. This isn't perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters:
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

RoadwayNetwork

+
+
+
+ +
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled with. This isn't perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters:
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable. #MC

Parameters:
  • county_variable (str) – Name of the variable where the county names are stored. Default to "county".
  • network_variable (str) – Name of the variable that should be written to. Default to "mpo".
  • as_integer (bool) – If true, will convert true/false to 1/0s.
  • mpo_counties (list) – List of county names that are within mpo region.
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters:
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Created ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters:
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters:
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Created hov corridor placeholder for project to write out corridor changes

+

hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Return type:
+

RoadwayNetwork

+
+
Parameters:
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance

+
+ +
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Created placeholder for project to write out managed

+

managed defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters:
+

df (pandas DataFrame) –

+
+
Returns:
+

dataframe with fixed width for each column. +dict: dictionary with columns names as keys, column width as values.

+
+
Return type:
+

pandas dataframe

+
+
+
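A hedged sketch of calling this static helper on a small pandas DataFrame. The docstring lists two returned objects (the fixed-width DataFrame and a column-width dictionary); unpacking them as a tuple is an assumption.

import pandas as pd

from lasso import ModelRoadwayNetwork

# A tiny illustrative table; column names are arbitrary here.
df = pd.DataFrame(
    {"model_link_id": [1, 22, 333], "name": ["Main St", "1st Ave", "Hwy 5"]}
)

# Tuple unpacking is an assumption based on the two documented return values.
fixed_df, width_dict = ModelRoadwayNetwork.dataframe_to_fixed_width(df)
print(width_dict)  # column name -> column width, per the docstring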
+ +
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

delete the roadway features defined in the project card. +valid links and nodes defined in the project gets deleted +and shapes corresponding to the deleted links are also deleted.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
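A hedged sketch of promoting a network_wrangler RoadwayNetwork to a ModelRoadwayNetwork. The file names are hypothetical, and the positional argument order of RoadwayNetwork.read (link, node, shape) is assumed from network_wrangler rather than verified here.

from network_wrangler import RoadwayNetwork

from lasso import ModelRoadwayNetwork

# Hypothetical standard-network files; argument order (link, node, shape) is assumed.
road_net = RoadwayNetwork.read("link.json", "node.geojson", "shape.geojson")

# Promote to a ModelRoadwayNetwork; parameters can be a dict or a lasso Parameters instance.
model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(road_net, parameters={})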
+ +
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters:
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDs by a scalar. ..todo #237: what if node ids are not a number?

+
+
Parameters:
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers

+
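A simple sketch of the scalar transform described above; the scalar is documented as a value added to each node ID, and 4500000 is the default shown in the signature.

from lasso import ModelRoadwayNetwork

gp_node_ids = [101, 102, 103]

# Managed-lane copies of general purpose nodes get offset IDs.
ml_node_ids = ModelRoadwayNetwork.get_managed_lane_node_ids(gp_node_ids, scalar=4500000)
print(ml_node_ids)  # expected: 4500101, 4500102, 4500103 (scalar added to each ID)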
+ +
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

Note: links with walk access are not marked as having walk access. Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145 (modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1]).

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters:
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type:
+

pandas series

+
+
+
+ +
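A hedged usage sketch that follows the parameter descriptions above. The file names are hypothetical, and the PM window ("15:00" to "19:00") is taken from the default time periods in the Parameters documentation.

from lasso import ModelRoadwayNetwork

# Hypothetical files; fast=True skips validation, per the read() documentation.
net = ModelRoadwayNetwork.read("link.json", "node.geojson", "shape.geojson", fast=True)

# Pull a PM-period, SOV-category view of the "lanes" variable as a pandas Series.
pm_sov_lanes = net.get_property_by_time_period_and_group(
    "lanes",
    time_period=["15:00", "19:00"],
    category="sov",
    default_return=0,  # value used where the variable or time period is not found
)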
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters:
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters:
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variables in the links_df file. If nothing is specified, assumes whatever the default is.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters:
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
+ +
+
Parameters:
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes network standard.

+
+
Parameters:
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
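A minimal sketch of reading a standard network with this class method. The file names are hypothetical; the keyword names come directly from the documented signature.

from lasso import ModelRoadwayNetwork

net = ModelRoadwayNetwork.read(
    link_filename="link.json",       # hypothetical link json
    node_filename="node.geojson",    # hypothetical node geojson
    shape_filename="shape.geojson",  # hypothetical true-shape geojson
    fast=True,                       # skip validation to speed up the read
)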
+ +
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reading lots of same type of file and concatenating them into a single DataFrame.

+
+
Parameters:
+

path (str) – File path to SHST match results.

+
+
Returns:
+

geopandas geodataframe

+
+
Return type:
+

geodataframe

+
+
+

##todo +not sure why we need, but should be in utilities not this class

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters:
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns:
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+
+
Return type:
+

GeoDataFrame

+
+
+

Turn the roadway network into a GeoDataFrame.

Parameters: roadway_net – the roadway network to export

+

returns: shapes dataframe

+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. #MC

Parameters:
  • output_epsg (int) – epsg number of output network.

+
+
Returns:
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Return type:
+

GeoDataFrame

+
+
+
+
Example usage:
+
net.select_roadway_features(
    selection=[{
        # a match condition for the from node using osm,
        # shared streets, or model node number
        'from': {'osm_model_link_id': '1234'},
        # a match for the to-node..
        'to': {'shstid': '4321'},
        # a regex or match for facility condition
        # could be # of lanes, facility type, etc.
        'facility': {'name': 'Main St'},
    }, …])

+
+
+
+
+
+
+
+
Parameters:
+
    +
  • selection – dictionary with keys for: +A - from node +B - to node +link - which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiple used to discourage shortest paths which meander from original search returned from name or ref query. If not set here, will default to value of sp_weight_factor in RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+ +
+ +
+
Return type:
+

bool

+
+
Parameters:
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters:
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple with length of four
- Boolean if shortest path found
- nx Directed graph of graph links
- route of shortest path nodes as List
- links in shortest path selected from links_df

+
+ +
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of

+
+
Parameters:
+

properties_to_split

dict +dictionary of output variable prefix mapped to the source variable and what to stratify it by +e.g. +{

+
+

’lanes’ : {‘v’:’lanes’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}}, +‘ML_lanes’ : {‘v’:’ML_lanes’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}}, +‘use’ : {‘v’:’use’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}},

+
+

}

+

+
+
+
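A hedged sketch showing the properties_to_split structure spelled out in the parameter description above. The file names are hypothetical, and note that the parameter description spells the key 'times_periods' while the Parameters defaults use 'time_periods'; the latter is used here and should be checked against the source.

from lasso import ModelRoadwayNetwork

net = ModelRoadwayNetwork.read("link.json", "node.geojson", "shape.geojson", fast=True)  # hypothetical files

# Output variable prefix -> source variable ('v') and the time periods to stratify by,
# mirroring the structure shown in the parameter description above.
properties_to_split = {
    "lanes": {"v": "lanes", "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    "ML_lanes": {"v": "ML_lanes", "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
}

# Produces period-specific variables such as lanes_AM and lanes_PM.
net.split_properties_by_time_period_and_category(properties_to_split=properties_to_split)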
+ +
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters:
+
    +
  • links_df – Links GeoDataFrame. Useful if want to update a portion of network links +(i.e. only centroid connectors). If not provided, will use entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns:
+

links_df with updated distance

+
+
+
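A short sketch using the documented keywords; with use_shapes=True the length comes from the true shapes where available, falling back to straight-line distance. File names are hypothetical.

from lasso import ModelRoadwayNetwork

net = ModelRoadwayNetwork.read("link.json", "node.geojson", "shape.geojson", fast=True)  # hypothetical files

# Recalculate link distance in miles from true shapes where available,
# overwriting existing values and updating net.links_df in place.
net.update_distance(
    use_shapes=True,
    units="miles",
    network_variable="distance",
    overwrite=True,
    inplace=True,
)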
+ +
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that that +property exists in the network.

+
+
Return type:
+

bool

+
+
Parameters:
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn't a specified value in the project card for existing when a change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluate whether the selection dictionary contains the minimum required values.

+
+
Return type:
+

bool

+
+
Parameters:
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type:
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • path – the path were the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does:
1. write out link and node fixed width data files for cube.
2. write out header and width correspondence.
3. write out cube network building script with header and width specification.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name to additional output link subset layers

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_generated/lasso.Parameters/index.html b/_generated/lasso.Parameters/index.html new file mode 100644 index 0000000..108e293 --- /dev/null +++ b/_generated/lasso.Parameters/index.html @@ -0,0 +1,553 @@ + + + + + + + lasso.Parameters — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a parameters instance +with a keyword argument setting the attribute. Parameters that are +not explicitly set will use default parameters listed in this class. +.. highlight:: python

+
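A minimal sketch of overriding defaults at runtime, as described above. The keyword names mirror attributes listed below (centroid_connect_lanes, mpo_counties); anything not passed keeps its class default.

from lasso import Parameters

# Override a couple of defaults; unspecified attributes keep the defaults listed below.
params = Parameters(
    centroid_connect_lanes=2,
    mpo_counties=["ANOKA", "DAKOTA", "HENNEPIN"],
)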
+
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00, "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
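As a hedged illustration (not lasso's internal implementation), the setting above expands a base variable into one model-network column per time period:

time_periods = ["EA", "AM", "MD", "PM", "EV"]  # keys of time_period_to_time
split_columns = ["lanes_" + tp for tp in time_periods]
print(split_columns)  # ['lanes_EA', 'lanes_AM', 'lanes_MD', 'lanes_PM', 'lanes_EV']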
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in the county_shape file.

Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class.Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +

Methods

+ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ + + + + + + + + + + + + + + +

cube_time_periods

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+
self.county_code_dict = {
    "Anoka": 1,
    "Carver": 2,
    "Dakota": 3,
    "Hennepin": 4,
    "Ramsey": 5,
    "Scott": 6,
    "Washington": 7,
    "external": 10,
    "Chisago": 11,
    "Goodhue": 12,
    "Isanti": 13,
    "Le Sueur": 14,
    "McLeod": 15,
    "Pierce": 16,
    "Polk": 17,
    "Rice": 18,
    "Sherburne": 19,
    "Sibley": 20,
    "St. Croix": 21,
    "Wright": 22,
}

+
+ +
+
+cube_time_periods
+

#MC
self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

self.route_type_mode_dict = {0: 8, 2: 9}

self.cube_time_periods = {"1": "AM", "2": "MD"}
self.cube_time_periods_name = {"AM": "pk", "MD": "op"}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link.
The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +
diff --git a/_generated/lasso.Project/index.html b/_generated/lasso.Project/index.html
new file mode 100644

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type:
+

dict

+
+
+
+ +
+roadway_link_changes
+

pandas dataframe of CUBE roadway link changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type:
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

ProjectCard constructor.

+
+
Parameters:
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of ProjectCard

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

ProjectCard constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatability(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

emme_name_to_wrangler_name(...)

rename emme names to wrangler names using crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

Determines whether the network build object is a node or a link.

get_operation_from_network_build_command()

Determines the network build object action type.

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters:
+

limit_variables_to_existing_network (bool) – True if no ad-hoc variables. Default to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters:
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

A Project instance.

+
+
+
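A hedged sketch of building a Project from a Cube log file, mirroring the typical usage shown above; the file names, the BASE_ROADWAY_DIR constant, and the project name are illustrative assumptions:

test_roadway_project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "st_paul_test.log"),  # hypothetical logfile
    base_roadway_dir=BASE_ROADWAY_DIR,                            # folder with the base roadway network
    project_name="example_lane_change",                           # made-up project name
)
test_roadway_project.evaluate_changes()
test_roadway_project.write_project_card(
    os.path.join(SCRATCH_DIR, "example_lane_change.yml")
)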
+ +
+
+static determine_roadway_network_changes_compatability(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

rename emme names to wrangler names using crosswalk file

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

Determines whether the network build object is a node or a link.

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

Determines the network build object action type.

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns:
+

A DataFrame representation of the log file.

+
+
+
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

networkbuildfilename (str or list[str]) – File path to an EMME network build file or list of network build file paths.

+
+
Returns:
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters:
+

filename (str) – File path to output .yml

+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
diff --git a/_generated/lasso.StandardTransit/index.html b/_generated/lasso.StandardTransit/index.html
new file mode 100644

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters:
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed:
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by following logic.

cube_format(row)

Creates a string representing the route in cube line file notation, given a row of a DataFrame representing a cube-formatted trip with the attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR.

evaluate_differences(transit_changes)

Compare changes from the transit_changes dataframe with the standard transit network returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

RoadwayNetwork to ModelRoadwayNetwork

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes that for the route in appropriate cube format.

shape_gtfs_to_dict_list(trip_id, shape_id, ...)

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of stepping through the routed nodes and corresponding them with shape nodes.

shape_gtfs_to_emme(trip_row)

Creates transit segment for the trips in appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number by the following logic.
#MC
For rail, uses the GTFS route_type variable:
https://developers.google.com/transit/gtfs/reference

::

    # route_type : cube_mode
    route_type_to_cube_mode = {
        0: 8,  # Tram, Streetcar, Light rail
        3: 0,  # Bus; further disaggregated for cube
        2: 9,  # Rail
    }

For buses, uses route id numbers and route name to find
express and suburban buses as follows:

::

    if not cube_mode:
        if 'express' in row['LONGNAME'].lower():
            cube_mode = 7  # Express
        elif int(row['route_id'].split("-")[0]) > 99:
            cube_mode = 6  # Suburban Local
        else:
            cube_mode = 5  # Urban Local
+
+
+
Parameters:
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns:
+

cube mode number

+
+
+
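As a hedged illustration of the bus disaggregation above (the row below is a hand-built pandas Series, not data from an actual GTFS feed):

import pandas as pd

row = pd.Series({"route_type": 3, "LONGNAME": "Hwy 61 Express", "route_id": "250-75"})

cube_mode = {0: 8, 3: 0, 2: 9}.get(row["route_type"])  # bus maps to 0 first
if not cube_mode:
    if "express" in row["LONGNAME"].lower():
        cube_mode = 7  # Express
    elif int(row["route_id"].split("-")[0]) > 99:
        cube_mode = 6  # Suburban Local
    else:
        cube_mode = 5  # Urban Local
print(cube_mode)  # -> 7 for this made-up row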
+ +
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation.
#MC

Parameters:

row – row of a DataFrame representing a cube-formatted trip, with the attributes
trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns:
+

string representation of route in cube line file notation

+
+
+
+ +
+
+evaluate_differences(transit_changes)[source]
+

Compare changes from the transit_changes dataframe with the standard transit network +returns the project card changes in dictionary format

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters:
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepare GTFS data for a cube .lin file.
#MC
Does the following operations:

1. Combines route, frequency, trip, and shape information
2. Converts time of day to time periods
3. Calculates cube route name from GTFS route name and properties
4. Assigns a cube-appropriate mode number
5. Assigns a cube-appropriate operator number

+
+
Returns:
+

+
DataFrame of trips with cube-appropriate values for:

  • NAME
  • ONEWAY
  • OPERATOR
  • MODE
  • HEADWAY
+
+
+

+
+
Return type:
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes that for the route in appropriate +cube format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list for a route in cube format.

+
+
+
+ +
+
+shape_gtfs_to_dict_list(trip_id, shape_id, add_nntime)[source]
+

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of +stepping through the routed nodes and corresponding them with shape nodes.

+

TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when +the transit routing on the roadway network is first performed.

+

As such, I’m copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications.

+
+
Parameters:
+
    +
  • trip_id – trip_id of the trip in question
  • shape_id – shape_id of the trip in question

Returns:

list of dict records with columns:
trip_id, shape_id, shape_pt_sequence, shape_mode_node_id, is_stop, access, stop_sequence

+
+
+
+ +
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segment for the trips in appropriate +emme format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a dataframe representation of the transit segment for a trip in emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters:
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns:
+

+
this_tp_num – if as_str is False, the numeric time period

this_tp – if as_str is True, the Cube time period name abbreviation

+
+
+
+ +
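A hedged usage sketch (assumes std_transit is an existing StandardTransit instance using the default MetCouncil time period definitions):

secs = 7 * 3600 + 30 * 60  # 7:30 AM as seconds from midnight
print(std_transit.time_to_cube_time_period(secs))                # e.g. "AM"
print(std_transit.time_to_cube_time_period(secs, as_str=False))  # numeric time period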
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the GTFS feed as a cube line file after converting GTFS properties to MetCouncil cube properties.
#MC

Parameters:

outpath – File location for the output cube line file.

+
+ +
+ +
diff --git a/_generated/lasso.logger/index.html b/_generated/lasso.logger/index.html
new file mode 100644

lasso.logger

+

Functions

+ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The info log is terse, giving just the bare minimum of detail so the network composition will be clear later. The debug log is very noisy, for debugging.

Pass None for either filename to skip that log file. Also echoes everything to the console if logToConsole is True.

+
+ +
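A hedged usage sketch; the log file names below are illustrative, not defaults:

from lasso.logger import setupLogging

setupLogging(
    infoLogFilename="lasso_info.log",    # terse log of network composition
    debugLogFilename="lasso_debug.log",  # very verbose, for debugging
    logToConsole=True,                   # also echo everything to the console
)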
diff --git a/_generated/lasso.util/index.html b/_generated/lasso.util/index.html
new file mode 100644

lasso.util

+

Functions

+ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

creates circular buffer polygon for node

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from seconds from midnight

shorten_name(name)

+
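A hedged sketch of the time helpers listed above (assumes they return datetime.time objects, as the summaries state):

from lasso.util import hhmmss_to_datetime, secs_to_datetime

t1 = hhmmss_to_datetime("07:30:00")      # from an hh:mm:ss string
t2 = secs_to_datetime(7 * 3600 + 1800)   # from seconds past midnight
print(t1, t2)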
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A geometry type that represents a single coordinate with +x,y and possibly z values.

+

A point is a zero-dimensional feature and has zero length and zero area.

+
+
Parameters:
+

args (float, or sequence of floats) –

The coordinates can either be passed as a single parameter, or as +individual float values using multiple parameters:

+
    +
  1. 1 parameter: a sequence or array-like of with 2 or 3 values.

  2. +
  3. 2 or 3 parameters (float): x, y, and possibly z.

  4. +
+

+
+
+
+
+x, y, z
+

Coordinate values

+
+
Type:
+

float

+
+
+
+ +

Examples

+

Constructing the Point using separate parameters for x and y:

+
>>> p = Point(1.0, -1.0)
+
+
+

Constructing the Point using a list of x, y coordinates:

+
>>> p = Point([1.0, -1.0])
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry.

Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A geometry type representing an area that is enclosed by a linear ring.

+

A polygon is a two-dimensional feature and has a non-zero area. It may +have one or more negative-space “holes” which are also bounded by linear +rings. If any rings cross each other, the feature is invalid and +operations on it may fail.

+
+
Parameters:
+
    +
  • shell (sequence) – A sequence of (x, y [,z]) numeric coordinate pairs or triples, or +an array-like with shape (N, 2) or (N, 3). +Also can be a sequence of Point objects.

  • +
  • holes (sequence) – A sequence of objects which satisfy the same requirements as the +shell parameters above

  • +
+
+
+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type:
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type:
+

sequence

+
+
+
+ +

Examples

+

Create a square polygon with no holes

+
>>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.))
+>>> polygon = Polygon(coords)
+>>> polygon.area
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry.

Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.

+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
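Editor-added sketch: with a tolerance of 0.5, the small deviations below are removed; pass preserve_topology=False to use the plain (faster but potentially invalid) Douglas-Peucker variant.

>>> from shapely import LineString
>>> line = LineString([(0, 0), (1, 0.1), (2, -0.1), (3, 0)])
>>> line.simplify(0.5)
<LINESTRING (0 0, 3 0)>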
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
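Editor-added sketch covering the binary predicates above (within, touches, overlaps).

>>> from shapely import Point, Polygon
>>> square = Polygon([(0, 0), (2, 0), (2, 2), (0, 2)])
>>> Point(1, 1).within(square)
True
>>> square.touches(Polygon([(2, 0), (4, 0), (4, 2), (2, 2)]))
True
>>> square.overlaps(Polygon([(1, 1), (3, 1), (3, 3), (1, 3)]))
True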
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
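Editor-added sketch of the basic derived properties on a 4 x 3 rectangle.

>>> from shapely import Polygon
>>> rect = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])
>>> rect.area
12.0
>>> rect.bounds
(0.0, 0.0, 4.0, 3.0)
>>> rect.centroid
<POINT (2 1.5)>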
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a +convex hull, more or less.

+

The convex hull of a three member multipoint, for example, is a +triangular polygon.

+

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelops the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
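Editor-added sketch: for a 10 x 10 square, the perimeter is 40 and the minimum clearance is 10 (moving any vertex by less than 10 units cannot make the ring invalid).

>>> from shapely import Polygon
>>> square = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
>>> square.length
40.0
>>> square.minimum_clearance
10.0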
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
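Editor-added sketch comparing the axis-aligned envelope with the oriented (rotated) envelope; by construction the rotated rectangle is never larger.

>>> from shapely import MultiPoint
>>> mp = MultiPoint([(0, 0), (4, 1), (2, 3)])
>>> mp.envelope.area
12.0
>>> mp.minimum_rotated_rectangle.area <= mp.envelope.area
True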
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
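Editor-added sketch of the serialization and coordinate-access properties on a Point.

>>> from shapely import Point
>>> pt = Point(1.5, 2.5)
>>> pt.wkt
'POINT (1.5 2.5)'
>>> len(pt.wkb)  # 1 byte-order flag + 4-byte type + two 8-byte doubles
21
>>> pt.xy
(array('d', [1.5]), array('d', [2.5]))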
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls

+
+ +
+ +
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

Creates a circular buffer polygon around a node location.

+
+
Parameters:
+
    +
  • lat – node lat

  • +
  • lon – node lon

  • +
  • meters – buffer distance, radius of circle

  • +
+
+
Returns:
+

Polygon

+
+
+
+ +
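A hedged usage sketch; the docstring only promises a Polygon return, so the coordinate system of the result is an assumption here.

from lasso.util import geodesic_point_buffer

# ~400 m circular buffer around a node near St. Paul, MN.
buffer_poly = geodesic_point_buffer(44.9521, -93.0966, 400)
print(buffer_poly.bounds)  # extent of the buffer polygon (assumed lon/lat)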
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

+
+
Expected in/out
+
in:  lon = -93.0965985, lat = 44.952112199999995, osm_node_id = 954734870

out: 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
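A hedged usage sketch based on the documented expected input/output above; the positional order is assumed to follow the signature (lat, long, osm_node_id).

from lasso.util import get_shared_streets_intersection_hash

h = get_shared_streets_intersection_hash(
    44.952112199999995, -93.0965985, osm_node_id=954734870
)
print(h)  # documented expected value: 69f13f881649cb21ee3b359730790bb9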
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters:
+

hhmmss_str – string of hh:mm:ss

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

datetime.time

+
+
+
+ +
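A minimal usage sketch:

from lasso.util import hhmmss_to_datetime

start = hhmmss_to_datetime("06:30:00")
print(start)  # datetime.time for 6:30 AM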
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime time object from seconds after midnight

+
+
Parameters:
+

secs – seconds from midnight

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

datetime.time

+
+
+
+ +
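A minimal usage sketch (23400 seconds is 6.5 hours after midnight):

from lasso.util import secs_to_datetime

start = secs_to_datetime(23400)
print(start)  # datetime.time for 06:30:00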
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input +parameters may iterable types like lists or arrays or single values. +The output shall be of the same type. Scalars in, scalars out. +Lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):
    return tuple(filter(None, [x, y, z]))

g2 = transform(id_func, g1)

+
+

Using pyproj >= 2.1, this example will accurately project Shapely geometries:

+
+

import pyproj

wgs84 = pyproj.CRS("EPSG:4326")
utm = pyproj.CRS("EPSG:32618")

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate a Unicode object into an ASCII string

+
+
Return type:
+

str

+
+
+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+ +
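A minimal usage sketch (standard unidecode transliteration behavior is assumed):

from lasso.util import unidecode

print(unidecode("Grüße aus München"))  # 'Grusse aus Munchen'
print(unidecode("北亰"))                # 'Bei Jing '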
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/functools/index.html b/_modules/functools/index.html new file mode 100644 index 0000000..77a2db2 --- /dev/null +++ b/_modules/functools/index.html @@ -0,0 +1,1081 @@ + + + + + + functools — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
[docs]class partial: + """New function with partial application of the given arguments + and keywords. + """ + + __slots__ = "func", "args", "keywords", "__dict__", "__weakref__" + + def __new__(cls, func, /, *args, **keywords): + if not callable(func): + raise TypeError("the first argument must be callable") + + if hasattr(func, "func"): + args = func.args + args + keywords = {**func.keywords, **keywords} + func = func.func + + self = super(partial, cls).__new__(cls) + + self.func = func + self.args = args + self.keywords = keywords + return self + + def __call__(self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(*self.args, *args, **keywords) + + @recursive_repr() + def __repr__(self): + qualname = type(self).__qualname__ + args = [repr(self.func)] + args.extend(repr(x) for x in self.args) + args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items()) + if type(self).__module__ == "functools": + return f"functools.{qualname}({', '.join(args)})" + return f"{qualname}({', '.join(args)})" + + def __reduce__(self): + return type(self), (self.func,), (self.func, self.args, + self.keywords or None, self.__dict__ or None) + + def __setstate__(self, state): + if not isinstance(state, tuple): + raise TypeError("argument to __setstate__ must be a tuple") + if len(state) != 4: + raise TypeError(f"expected 4 items in state, got {len(state)}") + func, args, kwds, namespace = state + if (not callable(func) or not isinstance(args, tuple) or + (kwds is not None and not isinstance(kwds, dict)) or + (namespace is not None and not isinstance(namespace, dict))): + raise TypeError("invalid partial state") + + args = tuple(args) # just in case it's a subclass + if kwds is None: + kwds = {} + elif type(kwds) is not dict: # XXX does it need to be *exactly* dict? + kwds = dict(kwds) + if namespace is None: + namespace = {} + + self.__dict__ = namespace + self.func = func + self.args = args + self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
+ +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/index.html b/_modules/index.html new file mode 100644 index 0000000..21b1409 --- /dev/null +++ b/_modules/index.html @@ -0,0 +1,114 @@ + + + + + + Overview: module code — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+ + +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/lasso/logger/index.html b/_modules/lasso/logger/index.html new file mode 100644 index 0000000..0b8cb3c --- /dev/null +++ b/_modules/lasso/logger/index.html @@ -0,0 +1,151 @@ + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
+
+ +
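A short usage sketch for the module above (file names are illustrative):

from lasso.logger import WranglerLogger, setupLogging

# Terse run log, verbose debug log, plus console output.
setupLogging("build.info.log", "build.debug.log", logToConsole=True)
WranglerLogger.info("Starting network build")
WranglerLogger.debug("Detailed diagnostics go to the debug log and console")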
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/lasso/parameters/index.html b/_modules/lasso/parameters/index.html new file mode 100644 index 0000000..69f40ca --- /dev/null +++ b/_modules/lasso/parameters/index.html @@ -0,0 +1,1029 @@ + + + + + + lasso.parameters — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.parameters

+import os
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise (ValueError(msg))
+
+
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. + """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 
'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source 
https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 'subregion' + #### + + self.output_variables = [ + 
"model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + #"roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("epsg:2875") + self.output_proj4 = '+proj=lcc +lat_0=32.1666666666667 +lon_0=-116.25 +lat_1=33.8833333333333 +lat_2=32.7833333333333 +x_0=2000000.0001016 +y_0=500000.0001016 +ellps=GRS80 +towgs84=-0.991,1.9072,0.5129,-1.25033e-07,-4.6785e-08,-5.6529e-08,0 +units=us-ft +no_defs +type=crs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '2875.prj') + self.wkt_projection = 'PROJCS["NAD83(HARN) / California zone 6 (ftUS)",GEOGCS["NAD83(HARN)",DATUM["NAD83_High_Accuracy_Reference_Network",SPHEROID["GRS 1980",6378137,298.257222101],TOWGS84[-0.991,1.9072,0.5129,-1.25033E-07,-4.6785E-08,-5.6529E-08,0]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4152"]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["latitude_of_origin",32.1666666666667],PARAMETER["central_meridian",-116.25],PARAMETER["standard_parallel_1",33.8833333333333],PARAMETER["standard_parallel_2",32.7833333333333],PARAMETER["false_easting",6561666.667],PARAMETER["false_northing",1640416.667],UNIT["US survey foot",0.304800609601219],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","2875"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 4756 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + 
"centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + "truck_access", + #MTC + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + self.__dict__.update(kwargs)
+
+ +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/lasso/project/index.html b/_modules/lasso/project/index.html new file mode 100644 index 0000000..c96fe69 --- /dev/null +++ b/_modules/lasso/project/index.html @@ -0,0 +1,1503 @@ + + + + + + lasso.project — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_link_changes: dataframe of roadway changes read from a log file + roadway_node_changes: dataframe of roadway changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatability( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
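The constructor accepts parameters either as a plain dictionary or as a Parameters instance; the two calls below are equivalent sketches (LASSO_DIR is a placeholder, and no change data is supplied, so evaluate stays False):

    from lasso.parameters import Parameters
    from lasso.project import Project

    LASSO_DIR = "/path/to/lasso"  # placeholder

    project_a = Project(parameters={"lasso_base_dir": LASSO_DIR})
    project_b = Project(parameters=Parameters(lasso_base_dir=LASSO_DIR))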
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
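A short usage sketch; SCRATCH_DIR is a placeholder, and card_data must already exist (evaluate=True in the constructor or an explicit evaluate_changes() call):

    import os

    project.evaluate_changes()  # builds project.card_data if not already done
    project.write_project_card(os.path.join(SCRATCH_DIR, "project_card.yml"))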
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
+ +
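create_project is the usual factory. A hedged sketch of the Cube log-file workflow, with placeholder paths (the base roadway folder is expected to contain link.json, node.geojson, and shape.geojson):

    import os

    project = Project.create_project(
        roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),  # placeholder
        base_roadway_dir=BASE_ROADWAY_DIR,                               # placeholder
        parameters={"lasso_base_dir": LASSO_DIR},                        # placeholder
    )
    project.write_project_card(os.path.join(SCRATCH_DIR, "roadway_change.yml"))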
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]): + """ + Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("N") + ] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("L") + ] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + def split_log(x): + return list(reader([x], delimiter=',', quotechar='"'))[0] + + _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df, _node_df]) + link_df = pd.concat([link_df, _link_df]) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + link_df.columns = [c.split("[")[0] for c in link_df.columns] + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + node_df.columns = [c.split("[")[0] for c in node_df.columns] + + if len(link_df) > 0: + # create operation history + action_history_df = ( + link_df.groupby(['A', 'B'])["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left") + + if len(node_df) > 0: + action_history_df = ( + node_df.groupby('N')["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + node_df = pd.merge(node_df, action_history_df, on='N', how="left") + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return link_df, node_df
+ +
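read_logfile can also be used on its own. The returned link and node DataFrames carry the raw CUBE columns plus the derived operation_history and operation_final columns; a sketch with a placeholder path:

    link_df, node_df = Project.read_logfile("roadway_changes.log")  # placeholder

    # e.g. keep only links whose net operation is a change
    changed_links_df = link_df[link_df["operation_final"] == "C"]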
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
+ +
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'] if n not in emme_node_id_crosswalk_df['emme_node_id'] + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
+ +
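A sketch of mapping EMME element ids to Wrangler ids; the crosswalk CSV (placeholder path) is expected to have emme_node_id and model_node_id columns:

    link_cmd_df, node_cmd_df, transit_cmd_df = Project.emme_id_to_wrangler_id(
        link_cmd_df,
        node_cmd_df,
        transit_cmd_df,
        emme_node_id_crosswalk_file="emme_node_id_crosswalk.csv",  # placeholder
    )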
[docs] def get_object_from_network_build_command(row): + """ + Determine which network object a network build command refers to. + + Args: + row: a single command dictionary from the network build command history + + Returns: + 'N' for node, 'L' for link, or one of 'TRANSIT_LINE', 'TRANSIT_STOP', + 'TRANSIT_SHAPE' for transit elements + """ + + if row.get('command') == 'create_link': + return 'L' + + if row.get('command') == 'create_node': + return 'N' + + if row.get('command') == 'delete_link': + return 'L' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'L' + if row.get('parameters').get('element_type') == 'NODE': + return 'N' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'TRANSIT_LINE' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'TRANSIT_STOP' + + if row.get('command') == 'modify_transit_line': + return 'TRANSIT_SHAPE'
+ +
[docs] def get_operation_from_network_build_command(row): + """ + Determine the operation type of a network build command. + + Args: + row: a single command dictionary from the network build command history + + Returns: + 'A' for add, 'C' for change, or 'D' for delete + """ + + if row.get('command') == 'create_link': + return 'A' + + if row.get('command') == 'create_node': + return 'A' + + if row.get('command') == 'delete_link': + return 'D' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'C' + if row.get('parameters').get('element_type') == 'NODE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'C' + + if row.get('command') == 'modify_transit_line': + return 'C'
+ +
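Both helpers act on a single command dictionary from the EMME command_history list. A worked example on an illustrative set_attribute command (not taken from a real export):

    command = {
        "command": "set_attribute",
        "parameters": {
            "element_type": "LINK",
            "element_ids": ["100-200"],
            "attribute_name": "lanes",
            "value": 3,
        },
    }

    Project.get_object_from_network_build_command(command)     # -> 'L'
    Project.get_operation_from_network_build_command(command)  # -> 'C'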
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
+ +
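A sketch of renaming EMME attribute names to Wrangler names; the crosswalk CSV (placeholder path) is expected to have emme_name and wrangler_name columns:

    link_cmd_df, node_cmd_df = Project.emme_name_to_wrangler_name(
        link_cmd_df,
        node_cmd_df,
        emme_name_crosswalk_file="emme_attribute_names.csv",  # placeholder
    )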
[docs] @staticmethod + def determine_roadway_network_changes_compatability( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
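A sketch of the compatibility check, assuming a ModelRoadwayNetwork and the link/node change DataFrames are already in hand; it raises a ValueError if a changed link or node is missing from the base network:

    Project.determine_roadway_network_changes_compatability(
        base_roadway_network,   # a ModelRoadwayNetwork instance
        link_changes_df,
        node_changes_df,
        Parameters(lasso_base_dir="/path/to/lasso"),  # placeholder
    )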
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated and initializes + self.card_data as an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if (self.transit_changes is not None and not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
+ +
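After evaluate_changes runs, card_data is a plain dictionary that write_project_card hands to ProjectCard; roughly:

    project.evaluate_changes()
    # project.card_data now looks like:
    # {"project": "<project name>",
    #  "changes": [<transit change dicts>] + [<highway change dicts>]}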
[docs] def add_transit_changes(self): + """ + Evaluates changes between the base and build transit objects and + returns a list of transit change dictionaries for the project card. + """ + transit_change_list = [] + if self.build_cube_transit_network: + transit_change_list = self.build_cube_transit_network.evaluate_differences( + self.base_cube_transit_network + ) + elif self.base_transit_network: + transit_change_list = self.base_transit_network.evaluate_differences( + self.transit_changes + ) + return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
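_final_op collapses the operation history of a single link or node into one net operation. A few worked cases, assuming x is any row-like mapping with an 'operation_history' list:

    Project._final_op({"operation_history": ["A", "D"]})  # -> "N" (added then deleted: no-op)
    Project._final_op({"operation_history": ["D", "A"]})  # -> "C" (deleted then re-added: change)
    Project._final_op({"operation_history": ["A", "C"]})  # -> "A" (added then edited: still an add)
    Project._final_op({"operation_history": ["C", "C"]})  # -> "C" (edited twice: change)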
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
+ WranglerLogger.warning(msg) + #raise ValueError(msg) + node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"] + + if add_link_dict: + add_link_dict["nodes"] = _process_node_additions(node_add_df) + else: + add_link_dict = {"category": "Add New Roadway", "nodes": _process_node_additions(node_add_df)} + + else: + None + + # combine together + + highway_change_list = list( + filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list) + ) + + return highway_change_list
+
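The highway change dictionaries assembled above take one of three shapes, keyed by category. An illustrative sketch of the returned list (ids and property values are made up):

    highway_change_list = [
        {"category": "Roadway Deletion",
         "links": {"model_link_id": [12345]}},
        {"category": "Add New Roadway",
         "links": [{"A": 1, "B": 2, "lanes_AM": 2}],
         "nodes": [{"model_node_id": 900001, "X": -122.4, "Y": 37.77}]},
        {"category": "Roadway Property Change",
         "facility": {"link": [{"model_link_id": [12345]}]},
         "properties": [{"property": "lanes_AM", "existing": 2, "set": 3}]},
    ]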
+ +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/lasso/roadway/index.html b/_modules/lasso/roadway/index.html new file mode 100644 index 0000000..d567c36 --- /dev/null +++ b/_modules/lasso/roadway/index.html @@ -0,0 +1,2029 @@ + + + + + + lasso.roadway — lasso documentation + + + + + + + + + + + + + + + + +

Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
+ +
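A minimal usage sketch for reading a standard network into a ModelRoadwayNetwork; the file paths below are hypothetical:

from lasso.roadway import ModelRoadwayNetwork

model_net = ModelRoadwayNetwork.read(
    link_filename="standard_network/link.json",      # hypothetical paths
    node_filename="standard_network/node.geojson",
    shape_filename="standard_network/shape.geojson",
    fast=True,        # skip schema validation for a quicker read
    parameters={},    # fall back to default lasso Parameters
)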
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
+ +
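Converting an already-loaded network_wrangler object might look like the sketch below; standard_net is assumed to be an existing RoadwayNetwork built elsewhere:

from lasso.roadway import ModelRoadwayNetwork

# standard_net: an existing network_wrangler RoadwayNetwork
model_net = ModelRoadwayNetwork.from_RoadwayNetwork(
    roadway_network_object=standard_net,
    parameters={},   # or an existing lasso Parameters instance
)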
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None): + """ + Splits properties by time period, assuming a variable structure of + + Args: + properties_to_split: dict + dictionary of output variable prefix mapped to the source variable and what to stratify it by + e.g. + { + 'lanes' : {'v':'lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'ML_lanes' : {'v':'ML_lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'use' : {'v':'use', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + } + + """ + import itertools + + if properties_to_split == None: + properties_to_split = self.parameters.properties_to_split + + for out_var, params in properties_to_split.items(): + if params["v"] not in self.links_df.columns: + WranglerLogger.warning( + "Specified variable to split: {} not in network variables: {}. Returning 0.".format( + params["v"], str(self.links_df.columns) + ) + ) + if params.get("time_periods") and params.get("categories"): + + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + time_suffix + "_" + category_suffix + ] = 0 + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[out_var + "_" + time_suffix] = 0 + elif params.get("time_periods") and params.get("categories"): + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + category_suffix + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=params["categories"][category_suffix], + time_period=params["time_periods"][time_suffix], + ) + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[ + out_var + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=None, + time_period=params["time_periods"][time_suffix], + ) + else: + raise ValueError( + "Shoudn't have a category without a time period: {}".format(params) + )
+ +
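Note that the code keys on "time_periods" (and optionally "categories") inside each entry; the docstring's "times_periods" spelling appears to be a typo. An illustrative properties_to_split dictionary and call, assuming model_net is an existing ModelRoadwayNetwork:

# Illustrative dictionary; the time-period bounds are examples only.
properties_to_split = {
    "lanes":    {"v": "lanes",    "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    "ML_lanes": {"v": "ML_lanes", "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
}
model_net.split_properties_by_time_period_and_category(properties_to_split)
# adds columns such as lanes_AM, lanes_PM, ML_lanes_AM, ML_lanes_PM to model_net.links_df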
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
+ +
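A usage sketch with a hypothetical county shapefile and name-to-code mapping; model_net is an existing ModelRoadwayNetwork:

model_net.calculate_county(
    county_shape="data/county_boundaries.shp",       # hypothetical shapefile
    county_shape_variable="CO_NAME",                 # hypothetical name field in that shapefile
    county_codes_dict={"Hennepin": 1, "Ramsey": 2},  # hypothetical name-to-code mapping
    overwrite=True,
)
# link centroids that fall outside every county polygon receive the fallback code 10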
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
+ +
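For example, attaching an attribute from a SharedStreets match result might look like this; the CSV path and source column name are hypothetical:

model_net.add_variable_using_shst_reference(
    var_shst_csvdata="shst_match/aadt_match.csv",   # must contain a shstReferenceId column
    shst_csv_variable="aadt",
    network_variable="AADT",
    network_var_type=int,
    overwrite=True,
)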
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
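Typical usage relies on the count file locations stored in Parameters; a sketch, assuming model_net is an existing ModelRoadwayNetwork with those files configured:

model_net.add_counts(network_variable="AADT")
# daily counts are split evenly across the four periods:
# count_AM = count_MD = count_PM = count_NT = AADT / 4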
[docs] @staticmethod + def read_match_result(path): + """ + Reads the shst geojson match returns. + + Returns shst dataframe. + + Reading lots of same type of file and concatenating them into a single DataFrame. + + Args: + path (str): File path to SHST match results. + + Returns: + geodataframe: geopandas geodataframe + + ##todo + not sure why we need, but should be in utilities not this class + """ + refId_gdf = DataFrame() + refid_file = glob.glob(path) + for i in refid_file: + new = gpd.read_file(i) + refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False) + return refId_gdf
+ +
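A usage sketch; the glob pattern for the SHST match returns is hypothetical:

from lasso.roadway import ModelRoadwayNetwork

matches_gdf = ModelRoadwayNetwork.read_match_result("shst_match/*.matched.geojson")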
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # , # targetted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (str): File path to source data SHST match result. + source_gdf (str): File path to source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + None + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_ink_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
+ +
[docs] def calculate_use( + self, + network_variable="use", + as_integer=True, + overwrite=False, + ): + """ + Calculates use variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "use" + as_integer (bool): If True, will convert true/false to 1/0s. Defauly to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "'use' Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + #MTC + self.links_df[network_variable] = int(1) + #/MTC + + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + #/MC + + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs] def create_ML_variable( + self, + network_variable="ML_lanes", + overwrite=False, + ): + """ + Created ML lanes placeholder for project to write out ML changes + + ML lanes default to 0, ML info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing ML Variable '{}' already in network".format( + network_variable + ) + ) + self.links_df[network_variable] = int(0) + else: + WranglerLogger.info( + "ML Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + WranglerLogger.info( + "Finished creating ML lanes variable: {}".format(network_variable) + )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + Returns: + RoadwayNetwork + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
+ + +
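A usage sketch; the TAZ threshold below is hypothetical and model_net is an existing ModelRoadwayNetwork:

model_net.calculate_centroidconnect(
    model_net.parameters,
    highest_taz_number=3100,   # hypothetical highest TAZ id in the zone system
)
# centroidconnect is set to 1 wherever either endpoint (A or B) is a TAZ number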
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
+ +
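The length calculation itself is a reprojection to a metric CRS followed by a meters-to-miles conversion; a minimal standalone sketch of that step, with made-up coordinates:

import geopandas as gpd
from shapely.geometry import LineString

links_gdf = gpd.GeoDataFrame(
    {"model_link_id": [1]},
    geometry=[LineString([(-93.27, 44.98), (-93.26, 44.98)])],
    crs="EPSG:4326",
)
# project to UTM 15N (EPSG:26915) so lengths are in meters, then convert to miles
links_gdf["distance"] = links_gdf.to_crs(epsg=26915).geometry.length / 1609.34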
[docs] def convert_int(self, int_col_names=[]): + """ + Convert integer columns + """ + + #MTC + WranglerLogger.info( + "Converting variable type to mtc standard" + ) + + int_col_names = self.parameters.int_col + #/MTC + #MC + """ + WranglerLogger.info("Converting variable type to MetCouncil standard") + + if not int_col_names: + int_col_names = self.parameters.int_col + #/MC + """ + ##Why are we doing this? + # int_col_names.remove("lanes") + + for c in list(set(self.links_df.columns) & set(int_col_names)): + try: + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + self.links_df[c] = self.links_df[c].replace("", 0) + self.links_df[c] = self.links_df[c].astype(int) + except: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + + for c in list(set(self.nodes_df.columns) & set(int_col_names)): + self.nodes_df[c] = self.nodes_df[c].replace("", 0) + self.nodes_df[c] = self.nodes_df[c].astype(int)
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
+ +
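Running the full formatting pipeline and inspecting the resulting model-ready tables might look like this, assuming model_net is an existing ModelRoadwayNetwork:

model_net.roadway_standard_to_met_council_network()
links_out = model_net.links_metcouncil_df    # EPSG:26915, true shapes merged in
nodes_out = model_net.nodes_metcouncil_df    # includes X, Y and the renamed N column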
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
+ +
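The crosswalk CSV is read into "net" and "dbf" columns; an illustrative crosswalk and call (the file name and mappings are hypothetical):

# net_to_dbf.csv (hypothetical contents):
#   net,dbf
#   model_link_id,MODEL_ID
#   assign_group,ASSIGNGRP
links_dbf_df = model_net.rename_variables_for_dbf(
    model_net.links_df,
    variable_crosswalk="net_to_dbf.csv",
    output_variables=["model_link_id", "assign_group", "geometry"],
)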
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "heuristic_num" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2" + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
+ + + # this should be moved to util +
[docs] @staticmethod + def dataframe_to_fixed_width(df): + """ + Convert dataframe to fixed width format, geometry column will not be transformed. + + Args: + df (pandas DataFrame). + + Returns: + pandas dataframe: dataframe with fixed width for each column. + dict: dictionary with columns names as keys, column width as values. + """ + WranglerLogger.info("Starting fixed width conversion") + + # get the max length for each variable column + max_width_dict = dict( + [ + (v, df[v].apply(lambda r: len(str(r)) if r != None else 0).max()) + for v in df.columns.values + if v != "geometry" + ] + ) + + fw_df = df.drop("geometry", axis=1).copy() + for c in fw_df.columns: + fw_df[c] = fw_df[c].apply(lambda x: str(x)) + fw_df["pad"] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x))) + fw_df[c] = fw_df.apply(lambda x: x["pad"] + x[c], axis=1) + + return fw_df, max_width_dict
+ +
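A small self-contained demonstration of the fixed-width helper; note that it expects a geometry column, which it drops before padding:

import geopandas as gpd
from shapely.geometry import Point

from lasso.roadway import ModelRoadwayNetwork

df = gpd.GeoDataFrame(
    {"N": [1, 12345], "county": ["Hennepin", "Ramsey"]},
    geometry=[Point(0, 0), Point(1, 1)],
)
fw_df, widths = ModelRoadwayNetwork.dataframe_to_fixed_width(df)
# widths == {'N': 5, 'county': 8}; values in fw_df are left-padded strings, e.g. '    1'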
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
\ No newline at end of file
diff --git a/_modules/lasso/transit/index.html b/_modules/lasso/transit/index.html
new file mode 100644
index 0000000..ef8576b
--- /dev/null
+++ b/_modules/lasso/transit/index.html
@@ -0,0 +1,2035 @@
+lasso.transit — lasso documentation

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict = Dict[str, Any]
+ +
[docs] def add_cube(self, transit_source: str): + """Reads a .lin file and adds it to existing TransitNetwork instance. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + """ + + """ + Figure out what kind of transit source it is + """ + + parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr") + + if "NAME=" in transit_source: + WranglerLogger.debug("reading transit source as string") + self.source_list.append("input_str") + parse_tree = parser.parse(transit_source) + elif os.path.isfile(transit_source): + print("reading: {}".format(transit_source)) + with open(transit_source) as file: + WranglerLogger.debug( + "reading transit source: {}".format(transit_source) + ) + self.source_list.append(transit_source) + parse_tree = parser.parse(file.read()) + elif os.path.isdir(transit_source): + import glob + + for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")): + self.add_cube(lin_file) + return + else: + msg = "{} not a valid transit line string, directory, or file" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("finished parsing cube line file") + # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty())) + transformed_tree_data = CubeTransformer().transform(parse_tree) + # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data)) + + _line_data = transformed_tree_data["lines"] + + line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()} + line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()} + new_lines = list(line_properties_dict.keys()) + """ + Before adding lines, check to see if any are overlapping with existing ones in the network + """ + + overlapping_lines = set(new_lines) & set(self.lines) + if overlapping_lines: + msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format( + transit_source, + "\n - ".join(self.source_list), + len(new_lines), + len(overlapping_lines), + overlapping_lines, + ) + print(msg) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.program_type = transformed_tree_data.get("program_type", None) + + self.lines += new_lines + self.line_properties.update(line_properties_dict) + self.shapes.update(line_shapes_dict) + + WranglerLogger.debug("Added lines to CubeTransit: \n".format(new_lines))
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
+ +
[docs] def add_additional_time_periods( + self, new_time_period_number: int, orig_line_name: str + ): + """ + Copies a route to another cube time period with appropriate + values for time-period-specific properties. + + New properties are stored under the new name in: + - ::self.shapes + - ::self.line_properties + + Args: + new_time_period_number (int): cube time period number + orig_line_name(str): name of the originating line, from which + the new line will copy its properties. + + Returns: + Line name with new time period. + """ + WranglerLogger.debug( + "adding time periods {} to line {}".format( + new_time_period_number, orig_line_name + ) + ) + + ( + route_id, + _init_time_period, + agency_id, + direction_id, + ) = CubeTransit.unpack_route_name(orig_line_name) + new_time_period_name = self.parameters.cube_time_periods[new_time_period_number] + new_tp_line_name = CubeTransit.build_route_name( + route_id=route_id, + time_period=new_time_period_name, + agency_id=agency_id, + direction_id=direction_id, + ) + + try: + assert new_tp_line_name not in self.lines + except: + msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + WrangerLogger.error(msg) + raise ValueError(msg) + + # copy to a new line and add it to list of lines to add + self.line_properties[new_tp_line_name] = copy.deepcopy( + self.line_properties[orig_line_name] + ) + self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name]) + self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name + + """ + Remove entries that aren't for this time period from the new line's properties list. + """ + this_time_period_properties_list = [ + p + "[" + str(new_time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + self.line_properties[new_tp_line_name].pop(k, None) + + """ + Remove entries for time period from the original line's properties list. + """ + for k in this_time_period_properties_list: + self.line_properties[orig_line_name].pop(k, None) + + """ + Add new line to list of lines to add. + """ + WranglerLogger.debug( + "Adding new time period {} for line {} as {}.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + ) + return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs] def create_add_route_card_dict(self, line: str):
+        """
+        Creates a project card change formatted dictionary for adding
+        a route based on the information in self.route_properties for
+        the line.
+
+        Args:
+            line: name of line that is being added
+
+        Returns:
+            A project card change-formatted dictionary for the route addition.
+        """
+        start_time_str, end_time_str = self.calculate_start_end_times(
+            self.line_properties[line]
+        )
+
+        standard_properties = self.cube_properties_to_standard_properties(
+            self.line_properties[line]
+        )
+
+        routing_properties = {
+            "property": "routing",
+            "set": self.shapes[line]["node"].tolist(),
+        }
+
+        add_card_dict = {
+            "category": "New Transit Service",
+            "facility": {
+                "route_id": line.split("_")[1],
+                "direction_id": int(line.split("_")[-1].strip('"')[-1]),
+                "start_time": start_time_str,
+                "end_time": end_time_str,
+                "agency_id": line.split("_")[0].strip('"'),
+            },
+            "properties": standard_properties + [routing_properties],
+        }
+
+        WranglerLogger.debug(
+            "Adding {} route to changes:\n{}".format(line, add_card_dict)
+        )
+        return add_card_dict
+ +
[docs] @staticmethod
+    def get_time_period_numbers_from_cube_properties(properties_list: list):
+        """
+        Finds properties that are associated with time periods and
+        returns the numbers in them.
+
+        Args:
+            properties_list (list): list of all properties.
+
+        Returns:
+            list of strings of the time period numbers found
+        """
+        time_periods_list = []
+        for p in properties_list:
+            if ("[" not in p) or ("]" not in p):
+                continue
+            tp_num = p.split("[")[1][0]
+            if tp_num and tp_num not in time_periods_list:
+                time_periods_list.append(tp_num)
+        return time_periods_list
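For reference, a minimal usage sketch of the helper above; the cube property names here are illustrative, not taken from a real line file: ::

    props = ["NAME", "HEADWAY[1]", "HEADWAY[3]", "ONEWAY"]
    tp_numbers = CubeTransit.get_time_period_numbers_from_cube_properties(props)
    # tp_numbers == ["1", "3"]  -- time period numbers come back as strings,
    # in order of first appearance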
+ +
[docs] @staticmethod
+    def build_route_name(
+        route_id: str = "",
+        time_period: str = "",
+        agency_id: str = 0,
+        direction_id: str = 1,
+    ):
+        """
+        Create a route name by concatenating route, time period, agency, and direction
+
+        Args:
+            route_id: i.e. 452-111
+            time_period: i.e. pk
+            direction_id: i.e. 1
+            agency_id: i.e. 0
+
+        Returns:
+            constructed line_name i.e. "0_452-111_452_pk1"
+        """
+
+        return (
+            str(agency_id)
+            + "_"
+            + str(route_id)
+            + "_"
+            + str(route_id.split("-")[0])
+            + "_"
+            + str(time_period)
+            + str(direction_id)
+        )
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
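A quick round-trip check of the two name helpers above, using the example from their docstrings: ::

    name = CubeTransit.build_route_name(
        route_id="452-111", time_period="pk", agency_id=0, direction_id=1
    )
    # name == "0_452-111_452_pk1"
    route_id, time_period, agency_id, direction_id = CubeTransit.unpack_route_name(name)
    # ("452-111", "pk", "0", "1")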
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + #WranglerLogger.error(msg) + #raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
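A minimal sketch of the bounding logic in calculate_start_end_times, using a hypothetical time_period_to_time mapping (real values come from the Parameters instance): ::

    time_period_to_time = {"AM": ("06:00", "09:00"), "MD": ("09:00", "15:30")}

    def to_minutes(hhmm):
        h, m = hhmm.split(":")
        return int(h) * 60 + int(m)

    starts = [to_minutes(s) for s, _ in time_period_to_time.values()]
    ends = [to_minutes(e) for _, e in time_period_to_time.values()]
    start_time_str = "{:02d}:{:02d}".format(*divmod(min(starts), 60))  # "06:00"
    end_time_str = "{:02d}:{:02d}".format(*divmod(max(ends), 60))      # "15:30"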
+ +
[docs] @staticmethod
+    def cube_properties_to_standard_properties(cube_properties_dict: dict):
+        """
+        Converts cube style properties to standard properties.
+
+        This is most pertinent to time-period-specific variables like headway,
+        and variables that have standard units such as headway, which is minutes
+        in cube and seconds in standard format.
+
+        Args:
+            cube_properties_dict: <cube style property name> : <property value>
+
+        Returns:
+            A list of dictionaries with values for `"property": <standard
+            style property name>, "set" : <property value with correct units>`
+
+        """
+        standard_properties_list = []
+        for k, v in cube_properties_dict.items():
+            change_item = {}
+            if any(i in k for i in ["HEADWAY", "FREQ"]):
+                change_item["property"] = "headway_secs"
+                change_item["set"] = v * 60
+            else:
+                change_item["property"] = k
+                change_item["set"] = v
+            standard_properties_list.append(change_item)
+
+        return standard_properties_list
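An illustrative conversion using the static method above; note that cube headways are in minutes while standard headway_secs values are in seconds: ::

    cube_props = {"HEADWAY[1]": 30, "ONEWAY": "T"}
    CubeTransit.cube_properties_to_standard_properties(cube_props)
    # [{"property": "headway_secs", "set": 1800},
    #  {"property": "ONEWAY", "set": "T"}]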
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs returns list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: dDataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
+ +
[docs] def shape_gtfs_to_dict_list(self, trip_id: str, shape_id: str, add_nntime: bool): + """ + This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of + stepping through the routed nodes and corresponding them with shape nodes. + + TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when + the transit routing on the roadway network is first performed. + + As such, I'm copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications. + + Args: + trip_id of the trip in question + shape_id of the trip in question + Returns: + list of dict records with columns: + trip_id + shape_id + shape_pt_sequence + shape_mode_node_id + is_stop + access + stop_sequence + """ + # get the stop times for this route + # https://developers.google.com/transit/gtfs/reference#stop_timestxt + trip_stop_times_df = self.feed.stop_times.loc[ self.feed.stop_times.trip_id == trip_id, + ['trip_id','arrival_time','departure_time','stop_id','stop_sequence','pickup_type','drop_off_type']].copy() + trip_stop_times_df.sort_values(by='stop_sequence', inplace=True) + trip_stop_times_df.reset_index(drop=True, inplace=True) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df: + # trip_id arrival_time departure_time stop_id stop_sequence pickup_type drop_off_type + # 0 10007 0 0 7781 1 0 NaN + # 1 10007 120 120 7845 2 0 NaN + # 2 10007 300 300 7790 3 0 NaN + # 3 10007 360 360 7854 4 0 NaN + # 4 10007 390 390 7951 5 0 NaN + # 5 10007 720 720 7950 6 0 NaN + # 6 10007 810 810 7850 7 0 NaN + # 7 10007 855 855 7945 8 0 NaN + # 8 10007 900 900 7803 9 0 NaN + # 9 10007 930 930 7941 10 0 NaN + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + + # get the shapes for this route + # https://developers.google.com/transit/gtfs/reference#shapestxt + trip_node_df = self.feed.shapes.loc[self.feed.shapes.shape_id == shape_id].copy() + trip_node_df.sort_values(by="shape_pt_sequence", inplace = True) + trip_node_df.reset_index(drop=True, inplace=True) + # print("trip_node_df.head(20):\n{}".format(trip_node_df.head(20))) + # print("trip_node_df.dtypes:\n{}".format(trip_node_df.dtypes)) + # trip_node_df: + # shape_id shape_pt_sequence shape_osm_node_id shape_shst_node_id shape_model_node_id shape_pt_lat shape_pt_lon + # 0 696 1 1429334016 35cb440c505534e8aedbd3a286b70eab 2139625 NaN NaN + # 1 696 2 444242480 39e263722d5849b3c732b48734671400 2164862 NaN NaN + # 2 696 3 5686705779 4c41c608c35f457079fd673bce5556e5 2169898 NaN NaN + # 3 696 4 3695761874 d0f5b2173189bbb1b5dbaa78a004e8c4 2021876 NaN NaN + # 4 696 5 1433982749 60726971f0fb359a57e9d8df30bf384b 2002078 NaN NaN + # 5 696 6 1433982740 634c301424647d5883191edf522180e3 2156807 NaN NaN + # 6 696 7 4915736746 f03c3d7f1aa0358a91c165f53dac1e20 2145185 NaN NaN + # 7 696 8 65604864 68b8df24f1572d267ecf834107741393 2120788 NaN NaN + # 8 696 9 65604866 e412a013ad45af6649fa1b396f74c127 2066513 NaN NaN + # 9 696 10 956664242 657e1602aa8585383ed058f28f7811ed 2006476 NaN NaN + # 10 696 11 291642561 726b03cced023a6459d7333885927208 2133933 NaN NaN + # 11 696 12 291642583 709a0c00811f213f7476349a2c002003 2159991 NaN NaN + # 12 696 13 291642745 c5aaab62e0c78c34d93ee57795f06953 2165343 NaN NaN + # 13 696 14 5718664845 c7f1f4aa88887071a0d28154fc84604b 2007965 NaN NaN + # 14 696 
15 291642692 0ef007a79b391e8ba98daf4985f26f9b 2160569 NaN NaN + # 15 696 16 5718664843 2ce63288e77747abc3a4124f0e28efcf 2047955 NaN NaN + # 16 696 17 3485537279 ec0c8eb524f41072a9fd87ecfd45e15f 2169094 NaN NaN + # 17 696 18 5718664419 57ca23828db4adea39355a92fb0fc3ff 2082102 NaN NaN + # 18 696 19 5718664417 4aba41268ada1058ee58e99a84e28d37 2019974 NaN NaN + # 19 696 20 65545418 d4f815a2f6da6c95d2f032a3cd61020c 2025374 NaN NaN # trip_node_df.dtypes: + # shape_id object + # shape_pt_sequence int64 + # shape_osm_node_id object + # shape_shst_node_id object + # shape_model_node_id object + # shape_pt_lat object + # shape_pt_lon object + + # we only need: shape_id, shape_pt_sequence, shape_model_node_id + trip_node_df = trip_node_df[['shape_id','shape_pt_sequence','shape_model_node_id']] + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + # stop_name object + # stop_lat float64 + # stop_lon float64 + # zone_id object + # agency_raw_name object + # stop_code object + # location_type float64 + # parent_station object + # stop_desc object + # stop_url object + # stop_timezone object + # wheelchair_boarding float64 + # platform_code object + # position object + # direction object + # * used by routes object + # osm_node_id object + # shst_node_id object + # model_node_id object + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # this is the same as shape_gtfs_to_cube but we'll build up a list of dicts with shape/stop information + shape_stop_dict_list = [] + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id' ] = trip_id + node_dict['is_stop' ] = True + node_dict['access' ] = access_v + node_dict['stop_sequence'] = stop_seq + shape_stop_dict_list.append(node_dict) + + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id'] = trip_id + node_dict['is_stop'] = False + shape_stop_dict_list.append(node_dict) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + # print("node_list_str: {}".format(node_list_str)) + return shape_stop_dict_list
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str
+ + +
[docs] def cube_format(self, row):
+        """
+        Creates a string representing the route in cube line file notation.
+        #MC
+        Args:
+            row: row of a DataFrame representing a cube-formatted trip, with the Attributes
+                trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR
+
+        Returns:
+            string representation of route in cube line file notation
+        """
+
+        s = '\nLINE NAME="{}",'.format(row.NAME)
+        s += '\n LONGNAME="{}",'.format(row.LONGNAME)
+        s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY)
+        s += "\n MODE={},".format(row.MODE)
+        s += "\n ONEWAY={},".format(row.ONEWAY)
+        s += "\n OPERATOR={},".format(row.OPERATOR)
+        s += "\n NODES={}".format(self.shape_gtfs_to_cube(row))
+
+        return s
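As a sketch of the output, the header portion of a cube line entry built by cube_format for a hypothetical trip row (the NODES block normally comes from shape_gtfs_to_cube) could look like: ::

    import pandas as pd

    row = pd.Series({"NAME": "0_452-111_452_pk1", "LONGNAME": "Express 452",
                     "tod_num": 1, "HEADWAY": 30, "MODE": 7, "ONEWAY": "T", "OPERATOR": 3})
    header = '\nLINE NAME="{}",\n LONGNAME="{}",\n HEADWAY[{}]={},\n MODE={},\n ONEWAY={},\n OPERATOR={},'.format(
        row.NAME, row.LONGNAME, row.tod_num, row.HEADWAY, row.MODE, row.ONEWAY, row.OPERATOR
    )
    # LINE NAME="0_452-111_452_pk1",
    #  LONGNAME="Express 452",
    #  HEADWAY[1]=30,
    #  ...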
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + # moved this here from top since this StandardTransit shouldn't depend on mtc... + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = 
base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
\ No newline at end of file
diff --git a/_modules/lasso/util/index.html b/_modules/lasso/util/index.html
new file mode 100644
index 0000000..3c66e1f
--- /dev/null
+++ b/_modules/lasso/util/index.html
@@ -0,0 +1,254 @@
+lasso.util — lasso documentation
Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+
+# needed by column_name_to_parts() below for error logging
+from .logger import WranglerLogger
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None):
+    """
+    Calculated per:
+       https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565
+    Expected in/out
+       -93.0965985, 44.952112199999995 osm_node_id = 954734870
+       69f13f881649cb21ee3b359730790bb9
+
+    """
+    import hashlib
+
+    # longitude comes first, then latitude, per the SharedStreets reference implementation
+    message = "Intersection {0:.5f} {1:.5f}".format(long, lat)
+    if osm_node_id:
+        message += " {}".format(osm_node_id)
+    unhashed = message.encode("utf-8")
+    hash = hashlib.md5(unhashed).hexdigest()
+    return hash
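Usage with the example coordinates from the docstring above (osm_node_id is optional): ::

    h = get_shared_streets_intersection_hash(44.952112199999995, -93.0965985, 954734870)
    # h is the 32-character md5 hex digest used to match SharedStreets intersection ids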
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters): + """ + creates circular buffer polygon for node + + Args: + lat: node lat + lon: node lon + meters: buffer distance, radius of circle + Returns: + Polygon + """ + proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84') + # Azimuthal equidistant projection + aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0' + project = partial( + pyproj.transform, + pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)), + proj_wgs84) + buf = Point(0, 0).buffer(meters) # distance in meters + return Polygon(transform(project, buf).exterior.coords[:])
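# Illustrative call (editorial example with assumed coordinates): roughly a
# 100-meter circular buffer around a node near St. Paul, returned as a shapely
# Polygon in WGS84 longitude/latitude coordinates.
node_buffer = geodesic_point_buffer(lat=44.95, lon=-93.09, meters=100.0)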
+ +
[docs]def create_locationreference(node, link): + """ + Builds SharedStreets-style location references for each link from the + coordinates of its A and B nodes, adding 'A_point', 'B_point', and + 'locationReferences' columns to the link GeoDataFrame (both inputs are + modified in place). + """ + node['X'] = node['geometry'].apply(lambda p: p.x) + node['Y'] = node['geometry'].apply(lambda p: p.y) + node['point'] = [list(xy) for xy in zip(node.X, node.Y)] + node_dict = dict(zip(node.model_node_id, node.point)) + + link['A_point'] = link['A'].map(node_dict) + link['B_point'] = link['B'].map(node_dict) + link['locationReferences'] = link.apply(lambda x: [{'sequence': 1, + 'point': x['A_point'], + 'distanceToNextRef': x['length'], + 'bearing': 0, + 'intersectionId': x['fromIntersectionId']}, + {'sequence': 2, + 'point': x['B_point'], + 'intersectionId': x['toIntersectionId']}], + axis=1)
+ +
[docs]def column_name_to_parts(c, parameters=None): + """ + Splits a composite column name into its base name, time period, category, + and a flag for whether it is a managed-lane ("ML"-prefixed) field. + + Args: + c: composite column name to split + parameters: Parameters instance; a default Parameters() is created if None + Returns: + tuple of (base_name, time_period, category, managed) + """ + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + + return base_name, time_period, category, managed
+ +
[docs]def shorten_name(name): + """ + Deduplicates and cleans a comma-separated string (or list) of names into a + single space-separated string, transliterating non-ASCII characters to + their closest English equivalents. + """ + if isinstance(name, str): + name_list = name.split(',') + else: + name_list = name + name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list] + + name_list = list(set(name_list)) + #name_list.remove('') + + name_new = ' '.join(name_list).strip(' ') + + # convert non-English characters to their English (ASCII) equivalents + name_new = unidecode(name_new) + + return name_new
+
+ +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/shapely/geometry/point/index.html b/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..edb5903 --- /dev/null +++ b/_modules/shapely/geometry/point/index.html @@ -0,0 +1,250 @@ + + + + + + shapely.geometry.point — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+import numpy as np
+
+import shapely
+from shapely.errors import DimensionError
+from shapely.geometry.base import BaseGeometry
+
+__all__ = ["Point"]
+
+
+
[docs]class Point(BaseGeometry): + """ + A geometry type that represents a single coordinate with + x,y and possibly z values. + + A point is a zero-dimensional feature and has zero length and zero area. + + Parameters + ---------- + args : float, or sequence of floats + The coordinates can either be passed as a single parameter, or as + individual float values using multiple parameters: + + 1) 1 parameter: a sequence or array-like of with 2 or 3 values. + 2) 2 or 3 parameters (float): x, y, and possibly z. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Examples + -------- + Constructing the Point using separate parameters for x and y: + + >>> p = Point(1.0, -1.0) + + Constructing the Point using a list of x, y coordinates: + + >>> p = Point([1.0, -1.0]) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + __slots__ = [] + + def __new__(self, *args): + if len(args) == 0: + # empty geometry + # TODO better constructor + return shapely.from_wkt("POINT EMPTY") + elif len(args) > 3: + raise TypeError(f"Point() takes at most 3 arguments ({len(args)} given)") + elif len(args) == 1: + coords = args[0] + if isinstance(coords, Point): + return coords + + # Accept either (x, y) or [(x, y)] + if not hasattr(coords, "__getitem__"): # generators + coords = list(coords) + coords = np.asarray(coords).squeeze() + else: + # 2 or 3 args + coords = np.array(args).squeeze() + + if coords.ndim > 1: + raise ValueError( + f"Point() takes only scalar or 1-size vector arguments, got {args}" + ) + if not np.issubdtype(coords.dtype, np.number): + coords = [float(c) for c in coords] + geom = shapely.points(coords) + if not isinstance(geom, Point): + raise ValueError("Invalid values passed to Point constructor") + return geom + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return shapely.get_x(self) + + @property + def y(self): + """Return y coordinate.""" + return shapely.get_y(self) + + @property + def z(self): + """Return z coordinate.""" + if not shapely.has_z(self): + raise DimensionError("This point has no z coordinate.") + # return shapely.get_z(self) -> get_z only supported for GEOS 3.7+ + return self.coords[0][2] + + @property + def __geo_interface__(self): + return {"type": "Point", "coordinates": self.coords[0]} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3.0 * scale_factor, 1.0 * scale_factor, fill_color, opacity)
+ + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +shapely.lib.registry[0] = Point +
+ +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/shapely/geometry/polygon/index.html b/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..4124afc --- /dev/null +++ b/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,431 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import numpy as np
+
+import shapely
+from shapely.algorithms.cga import is_ccw_impl, signed_area
+from shapely.errors import TopologicalError
+from shapely.geometry.base import BaseGeometry
+from shapely.geometry.linestring import LineString
+from shapely.geometry.point import Point
+
+__all__ = ["Polygon", "LinearRing"]
+
+
+def _unpickle_linearring(wkb):
+    linestring = shapely.from_wkb(wkb)
+    srid = shapely.get_srid(linestring)
+    linearring = shapely.linearrings(shapely.get_coordinates(linestring))
+    if srid:
+        linearring = shapely.set_srid(linearring, srid)
+    return linearring
+
+
+class LinearRing(LineString):
+    """
+    A geometry type composed of one or more line segments
+    that forms a closed loop.
+
+    A LinearRing is a closed, one-dimensional feature.
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+
+    Parameters
+    ----------
+    coordinates : sequence
+        A sequence of (x, y [,z]) numeric coordinate pairs or triples, or
+        an array-like with shape (N, 2) or (N, 3).
+        Also can be a sequence of Point objects.
+
+    Notes
+    -----
+    Rings are automatically closed. There is no need to specify a final
+    coordinate pair identical to the first.
+
+    Examples
+    --------
+    Construct a square ring.
+
+    >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+    >>> ring.is_closed
+    True
+    >>> list(ring.coords)
+    [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
+    >>> ring.length
+    4.0
+
+    """
+
+    __slots__ = []
+
+    def __new__(self, coordinates=None):
+        if coordinates is None:
+            # empty geometry
+            # TODO better way?
+            return shapely.from_wkt("LINEARRING EMPTY")
+        elif isinstance(coordinates, LineString):
+            if type(coordinates) == LinearRing:
+                # return original objects since geometries are immutable
+                return coordinates
+            elif not coordinates.is_valid:
+                raise TopologicalError("An input LineString must be valid.")
+            else:
+                # LineString
+                # TODO convert LineString to LinearRing more directly?
+                coordinates = coordinates.coords
+
+        else:
+            if hasattr(coordinates, "__array__"):
+                coordinates = np.asarray(coordinates)
+            if isinstance(coordinates, np.ndarray) and np.issubdtype(
+                coordinates.dtype, np.number
+            ):
+                pass
+            else:
+                # check coordinates on points
+                def _coords(o):
+                    if isinstance(o, Point):
+                        return o.coords[0]
+                    else:
+                        return [float(c) for c in o]
+
+                coordinates = np.array([_coords(o) for o in coordinates])
+                if not np.issubdtype(coordinates.dtype, np.number):
+                    # conversion of coords to 2D array failed, this might be due
+                    # to inconsistent coordinate dimensionality
+                    raise ValueError("Inconsistent coordinate dimensionality")
+
+        if len(coordinates) == 0:
+            # empty geometry
+            # TODO better constructor + should shapely.linearrings handle this?
+            return shapely.from_wkt("LINEARRING EMPTY")
+
+        geom = shapely.linearrings(coordinates)
+        if not isinstance(geom, LinearRing):
+            raise ValueError("Invalid values passed to LinearRing constructor")
+        return geom
+
+    @property
+    def __geo_interface__(self):
+        return {"type": "LinearRing", "coordinates": tuple(self.coords)}
+
+    def __reduce__(self):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        return (_unpickle_linearring, (shapely.to_wkb(self, include_srid=True),))
+
+    @property
+    def is_ccw(self):
+        """True is the ring is oriented counter clock-wise"""
+        return bool(is_ccw_impl()(self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return bool(shapely.is_simple(self))
+
+
+shapely.lib.registry[2] = LinearRing
+
+
+class InteriorRingSequence:
+
+    _parent = None
+    _ndim = None
+    _index = 0
+    _length = 0
+
+    def __init__(self, parent):
+        self._parent = parent
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return shapely.get_num_interior_rings(self._parent)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    def _get_ring(self, i):
+        return shapely.get_interior_ring(self._parent, i)
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A geometry type representing an area that is enclosed by a linear ring. + + A polygon is a two-dimensional feature and has a non-zero area. It may + have one or more negative-space "holes" which are also bounded by linear + rings. If any rings cross each other, the feature is invalid and + operations on it may fail. + + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples, or + an array-like with shape (N, 2) or (N, 3). + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + + Examples + -------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + + __slots__ = [] + + def __new__(self, shell=None, holes=None): + if shell is None: + # empty geometry + # TODO better way? + return shapely.from_wkt("POLYGON EMPTY") + elif isinstance(shell, Polygon): + # return original objects since geometries are immutable + return shell + else: + shell = LinearRing(shell) + + if holes is not None: + if len(holes) == 0: + # shapely constructor cannot handle holes=[] + holes = None + else: + holes = [LinearRing(ring) for ring in holes] + + geom = shapely.polygons(shell, holes=holes) + if not isinstance(geom, Polygon): + raise ValueError("Invalid values passed to Polygon constructor") + return geom + + @property + def exterior(self): + return shapely.get_exterior_ring(self) + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not" + ) + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return {"type": "Polygon", "coordinates": tuple(coords)} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] for interior in self.interiors + ] + path = " ".join( + [ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords + ] + ) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2.0 * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin)])
+ + +shapely.lib.registry[3] = Polygon + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring) / s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring) / s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) +
+ +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_modules/shapely/ops/index.html b/_modules/shapely/ops/index.html new file mode 100644 index 0000000..8dda2b0 --- /dev/null +++ b/_modules/shapely/ops/index.html @@ -0,0 +1,851 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from warnings import warn
+
+import shapely
+from shapely.algorithms.polylabel import polylabel  # noqa
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.geometry import (
+    GeometryCollection,
+    LineString,
+    MultiLineString,
+    MultiPoint,
+    Point,
+    Polygon,
+    shape,
+)
+from shapely.geometry.base import BaseGeometry, BaseMultipartGeometry
+from shapely.geometry.polygon import orient as orient_
+from shapely.prepared import prep
+
+__all__ = [
+    "cascaded_union",
+    "linemerge",
+    "operator",
+    "polygonize",
+    "polygonize_full",
+    "transform",
+    "unary_union",
+    "triangulate",
+    "voronoi_diagram",
+    "split",
+    "nearest_points",
+    "validate",
+    "snap",
+    "shared_paths",
+    "clip_by_rect",
+    "orient",
+    "substring",
+]
+
+
+class CollectionOperator:
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        collection = shapely.polygonize(obs)
+        return collection.geoms
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, cut edges, dangles, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        return shapely.polygonize_full(obs)
+
+    def linemerge(self, lines, directed=False):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.  Returns a
+        LineString or MultiLineString when lines are not contiguous.
+        """
+        source = None
+        if getattr(lines, "geom_type", None) == "MultiLineString":
+            source = lines
+        elif hasattr(lines, "geoms"):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, "__iter__"):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError(f"Cannot linemerge {lines}")
+        return shapely.line_merge(source, directed=directed)
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        .. deprecated:: 1.8
+            This function was superseded by :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning,
+            stacklevel=2,
+        )
+        return shapely.union_all(geoms, axis=None)
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        Usually used to convert a collection into the smallest set of polygons
+        that cover the same area.
+        """
+        return shapely.union_all(geoms, axis=None)
+
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise the list of LineString edges is returned.
+
+    """
+    collection = shapely.delaunay_triangles(geom, tolerance=tolerance, only_edges=edges)
+    return [g for g in collection.geoms]
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a list of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The tolerance `argument` can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    try:
+        result = shapely.voronoi_polygons(
+            geom, tolerance=tolerance, extend_to=envelope, only_edges=edges
+        )
+    except shapely.GEOSException as err:
+        errstr = "Could not create Voronoi Diagram with the specified inputs "
+        errstr += f"({err!s})."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr) from err
+
+    if result.geom_type != "GeometryCollection":
+        return GeometryCollection([result])
+    return result
+
+
+def validate(geom):
+    return shapely.is_valid_reason(geom)
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.geom_type in ("Point", "LineString", "LinearRing", "Polygon"): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)(zip(*func(*zip(*geom.exterior.coords)))) + holes = list( + type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)([func(*c) for c in geom.exterior.coords]) + holes = list( + type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + elif geom.geom_type.startswith("Multi") or geom.geom_type == "GeometryCollection": + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError(f"Type {geom.geom_type!r} not recognized")
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = shapely.shortest_line(g1, g2) + if seq is None: + if g1.is_empty: + raise ValueError("The first input geometry is empty") + else: + raise ValueError("The second input geometry is empty") + + p1 = shapely.get_point(seq, 0) + p2 = shapely.get_point(seq, 1) + return (p1, p2) + + +def snap(g1, g2, tolerance): + """Snap one geometry to another with a given tolerance + + Vertices of the first geometry are snapped to vertices of the second + geometry. The resulting snapped geometry is returned. The input geometries + are not modified. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Example + ------- + >>> square = Polygon([(1,1), (2, 1), (2, 2), (1, 2), (1, 1)]) + >>> line = LineString([(0,0), (0.8, 0.8), (1.8, 0.95), (2.6, 0.5)]) + >>> result = snap(line, square, 0.5) + >>> result.wkt + 'LINESTRING (0 0, 1 1, 2 1, 2.6 0.5)' + """ + return shapely.snap(g1, g2, tolerance) + + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return shapely.shared_paths(g1, g2) + + +class SplitOp: + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [ + pg for pg in polygonize(union) if poly.contains(pg.representative_point()) + ] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.geom_type in ("Polygon", "MultiPolygon"): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance( + splitter, MultiLineString + ): + raise GeometryTypeError( + "Second argument must be either a LineString or a MultiLineString" + ) + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | 
No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == "1": + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError("Input geometry segment overlaps with the splitter.") + elif relation[0] == "0" or relation[3] == "0": + # The splitter crosses or touches the line's interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, "0********"): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords) - 1): + point1 = coords[i] + point2 = coords[i + 1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx**2 + dy**2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [LineString(coords[: i + 2]), LineString(coords[i + 1 :])] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[: i + 1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i + 1 :]), + ] + return [line] + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. 
+ When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. + + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.geom_type in ("MultiLineString", "MultiPolygon"): + return GeometryCollection( + [i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms] + ) + + elif geom.geom_type == "LineString": + if splitter.geom_type in ( + "LineString", + "MultiLineString", + "Polygon", + "MultiPolygon", + ): + split_func = SplitOp._split_line_with_line + elif splitter.geom_type == "Point": + split_func = SplitOp._split_line_with_point + elif splitter.geom_type == "MultiPoint": + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError( + f"Splitting a LineString with a {splitter.geom_type} is not supported" + ) + + elif geom.geom_type == "Polygon": + if splitter.geom_type == "LineString": + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError( + f"Splitting a Polygon with a {splitter.geom_type} is not supported" + ) + + else: + raise GeometryTypeError( + f"Splitting {geom.geom_type} geometry is not supported" + ) + + return GeometryCollection(split_func(geom, splitter)) + + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. 
+ + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError( + "Can only calculate a substring of LineString geometries. " + f"A {geom.geom_type} was provided." + ) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [(end_point.x, end_point.y)] + else: + vertex_list = [(start_point.x, start_point.y)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append((start_point.x, start_point.y)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append((end_point.x, end_point.y)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + return shapely.clip_by_rect(geom, xmin, ymin, xmax, ymax) + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. 
+ + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
+ +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/_sources/_generated/lasso.CubeTransit.rst.txt b/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + ~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + ~ModelRoadwayNetwork.path_search + ~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + 
~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/_sources/_generated/lasso.Parameters.rst.txt b/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..28d2c86 --- /dev/null +++ b/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.cube_time_periods + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/_sources/_generated/lasso.Project.rst.txt b/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..e6e6bcc --- /dev/null +++ b/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatability + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/_sources/_generated/lasso.StandardTransit.rst.txt b/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..4fae048 --- /dev/null +++ b/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,33 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_dict_list + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/_sources/_generated/lasso.logger.rst.txt b/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/_sources/_generated/lasso.util.rst.txt b/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/_sources/autodoc.rst.txt b/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/_sources/index.rst.txt b/_sources/index.rst.txt new file mode 100644 index 0000000..616853c --- /dev/null +++ b/_sources/index.rst.txt @@ -0,0 +1,36 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +`network_wrangler `_ package +for MetCouncil and MTC. It aims to have the following functionality: + +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for the respective agency and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/_sources/running.md.txt b/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/_sources/setup.md.txt b/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/_sources/starting.md.txt b/_sources/starting.md.txt new file mode 100644 index 0000000..8886f95 --- /dev/null +++ b/_sources/starting.md.txt @@ -0,0 +1,292 @@ +# Starting Out + +## Installation + +If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +Example using a conda environment (recommended) and using the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso , you can do so by installing it from the `develop` branch of + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n +conda activate +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The -e installs it in editable mode. +2. If you are not part of the project team and want to contribute code bxack to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. if you wanted to install from a specific tag/version number or branch, replace `@master` with `@` or `@tag` +4. 
If you want to make use of frequent developer updates for network wrangler as well, you can also install it from clone by copying the instructions for cloning and installing Lasso for Network Wrangler + +If you are going to be doing Lasso development, we also recommend: + - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc. + with Python syntax highlighting turned on. + - [GitHub Desktop](https://desktop.github.com/) to locally update your clones + +## Brief Intro + +Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility. + +Both Lasso and NetworkWrangler are built around the following data schemas: + - [`roadway network`], which is based on a mashup of Open Street Map and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three json files reprsenting: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. + - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and + - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml. + +In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler. + - [`Cube public transport line files`], which define a set of transit lines in the cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and defines the changes; or contains information about a new facility to be constructed or a new service to be run.; + - _Scenario_ object, which consist of at least a RoadwayNetwork, and +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, creates project cards from one of the following: a base and a build transit network in cube format, a base and build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object is a subclass of `RoadwayNetwork` and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files` + . 
In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model, including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler. + - [`Cube public transport line files`], which define a set of transit lines in the cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run; + - _Scenario_ object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios. Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, which creates project cards from one of the following: a base and a build transit network in cube format, a base and build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object, a subclass of `RoadwayNetwork` that contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object, with methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files`. It can parse cube line file properties and shapes into python dictionaries, compare line files, and represent the changes as Project Card dictionaries. + - _Parameters_, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default parameters listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries, and manipulates roadway network data, which is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_filename=MY_LINK_FILE, + node_filename=MY_NODE_FILE, + shape_filename=MY_SHAPE_FILE, + shape_foreign_key='shape_id', + ) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +net.apply_roadway_feature_change( + net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive") + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard + +#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key='shape_id', + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network. + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +```
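The example above covers the transit comparison. For the Cube log file workflow, a rough sketch follows; the keyword argument names (`base_roadway_dir`, `roadway_log_file`) and file names are assumptions for illustration, so verify them against `Project.create_project`'s signature in your installed version.

```python
# Hypothetical sketch of creating a project card from a Cube log file and a
# base roadway network. Keyword names and paths are assumptions, not the
# documented API -- check Project.create_project before using.
test_roadway_project = Project.create_project(
    base_roadway_dir=os.path.join(CUBE_DIR, "base_roadway"),
    roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),
)

test_roadway_project.write_project_card(
    os.path.join(SCRATCH_DIR, "t_roadway_log_test.yml")
)
```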
#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork class with additional understanding about how to translate and write the network out to the MetCouncil Roadway Network schema. + +```python + +net = ModelRoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key='shape_id', + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. + +```python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can also be initialized to override those parameters at object instantiation using a dictionary. + +```python +import yaml + +# read parameters from a yaml configuration file +# (parameters could also be provided directly as key/value pairs) +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# the network is written using the parameters given above +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario + + + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate <my_lasso_environment> + jupyter notebook + ``` diff --git a/_static/_sphinx_javascript_frameworks_compat.js b/_static/_sphinx_javascript_frameworks_compat.js new file mode 100644 index 0000000..8141580 --- /dev/null +++ b/_static/_sphinx_javascript_frameworks_compat.js @@ -0,0 +1,123 @@ +/* Compatability shim for jQuery and underscores.js. + * + * Copyright Sphinx contributors + * Released under the two clause BSD licence + */ + +/** + * small helper function to urldecode strings + * + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL + */ +jQuery.urldecode = function(x) { + if (!x) { + return x + } + return decodeURIComponent(x.replace(/\+/g, ' ')); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. Multiple values per key are supported, + * it will always return arrays of strings for the value parts.
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/_static/basic.css b/_static/basic.css new file mode 100644 index 0000000..7577acb --- /dev/null +++ b/_static/basic.css @@ -0,0 +1,903 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +nav.contents, +aside.topic, +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + 
border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; 
+ word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: -1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + 
+span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/_static/css/badge_only.css b/_static/css/badge_only.css new file mode 100644 index 0000000..c718cee --- /dev/null +++ b/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version 
.fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/_static/css/fonts/Roboto-Slab-Bold.woff b/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/_static/css/fonts/Roboto-Slab-Bold.woff2 b/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/_static/css/fonts/Roboto-Slab-Regular.woff b/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/_static/css/fonts/Roboto-Slab-Regular.woff2 b/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/_static/css/fonts/fontawesome-webfont.eot b/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/_static/css/fonts/fontawesome-webfont.svg b/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/_static/css/fonts/fontawesome-webfont.ttf b/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/_static/css/fonts/fontawesome-webfont.woff b/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/_static/css/fonts/fontawesome-webfont.woff2 b/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/_static/css/fonts/lato-bold-italic.woff b/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/_static/css/fonts/lato-bold-italic.woff differ diff --git a/_static/css/fonts/lato-bold-italic.woff2 b/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/_static/css/fonts/lato-bold.woff b/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/_static/css/fonts/lato-bold.woff differ diff --git a/_static/css/fonts/lato-bold.woff2 b/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/_static/css/fonts/lato-bold.woff2 differ diff --git a/_static/css/fonts/lato-normal-italic.woff b/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/_static/css/fonts/lato-normal-italic.woff differ diff --git a/_static/css/fonts/lato-normal-italic.woff2 b/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/_static/css/fonts/lato-normal.woff b/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and 
b/_static/css/fonts/lato-normal.woff differ diff --git a/_static/css/fonts/lato-normal.woff2 b/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/_static/css/fonts/lato-normal.woff2 differ diff --git a/_static/css/theme.css b/_static/css/theme.css new file mode 100644 index 0000000..19a446a --- /dev/null +++ b/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content 
.attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! + * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 
FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 .fa-pull-right.headerlink,.rst-content p .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download span.fa-pull-right:first-child,.wy-menu-vertical li.current>a button.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-right.toctree-expand,.wy-menu-vertical li button.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .eqno .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a button.pull-left.toctree-expand,.wy-menu-vertical li.on a button.pull-left.toctree-expand,.wy-menu-vertical li button.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .eqno .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt 
.pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a button.pull-right.toctree-expand,.wy-menu-vertical li.on a button.pull-right.toctree-expand,.wy-menu-vertical li button.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root .fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download 
span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before{content:""}.fa-question-circle:before{content:""}.fa-info-circle:before{content:""}.fa-crosshairs:before{content:""}.fa-times-circle-o:before{content:""}.fa-check-circle-o:before{content:""}.fa-ban:before{content:""}.fa-arrow-left:before{content:""}.fa-arrow-right:before{content:""}.fa-arrow-up:before{content:""}.fa-arrow-down:before{content:""}.fa-mail-forward:before,.fa-share:before{content:""}.fa-expand:before{content:""}.fa-compress:before{content:""}.fa-plus:before{content:""}.fa-minus:before{content:""}.fa-asterisk:before{content:""}.fa-exclamation-circle:before,.rst-content .admonition-title:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning 
.wy-input-context:before{content:""}.fa-gift:before{content:""}.fa-leaf:before{content:""}.fa-fire:before,.icon-fire:before{content:""}.fa-eye:before{content:""}.fa-eye-slash:before{content:""}.fa-exclamation-triangle:before,.fa-warning:before{content:""}.fa-plane:before{content:""}.fa-calendar:before{content:""}.fa-random:before{content:""}.fa-comment:before{content:""}.fa-magnet:before{content:""}.fa-chevron-up:before{content:""}.fa-chevron-down:before{content:""}.fa-retweet:before{content:""}.fa-shopping-cart:before{content:""}.fa-folder:before{content:""}.fa-folder-open:before{content:""}.fa-arrows-v:before{content:""}.fa-arrows-h:before{content:""}.fa-bar-chart-o:before,.fa-bar-chart:before{content:""}.fa-twitter-square:before{content:""}.fa-facebook-square:before{content:""}.fa-camera-retro:before{content:""}.fa-key:before{content:""}.fa-cogs:before,.fa-gears:before{content:""}.fa-comments:before{content:""}.fa-thumbs-o-up:before{content:""}.fa-thumbs-o-down:before{content:""}.fa-star-half:before{content:""}.fa-heart-o:before{content:""}.fa-sign-out:before{content:""}.fa-linkedin-square:before{content:""}.fa-thumb-tack:before{content:""}.fa-external-link:before{content:""}.fa-sign-in:before{content:""}.fa-trophy:before{content:""}.fa-github-square:before{content:""}.fa-upload:before{content:""}.fa-lemon-o:before{content:""}.fa-phone:before{content:""}.fa-square-o:before{content:""}.fa-bookmark-o:before{content:""}.fa-phone-square:before{content:""}.fa-twitter:before{content:""}.fa-facebook-f:before,.fa-facebook:before{content:""}.fa-github:before,.icon-github:before{content:""}.fa-unlock:before{content:""}.fa-credit-card:before{content:""}.fa-feed:before,.fa-rss:before{content:""}.fa-hdd-o:before{content:""}.fa-bullhorn:before{content:""}.fa-bell:before{content:""}.fa-certificate:before{content:""}.fa-hand-o-right:before{content:""}.fa-hand-o-left:before{content:""}.fa-hand-o-up:before{content:""}.fa-hand-o-down:before{content:""}.fa-arrow-circle-left:before,.icon-circle-arrow-left:before{content:""}.fa-arrow-circle-right:before,.icon-circle-arrow-right:before{content:""}.fa-arrow-circle-up:before{content:""}.fa-arrow-circle-down:before{content:""}.fa-globe:before{content:""}.fa-wrench:before{content:""}.fa-tasks:before{content:""}.fa-filter:before{content:""}.fa-briefcase:before{content:""}.fa-arrows-alt:before{content:""}.fa-group:before,.fa-users:before{content:""}.fa-chain:before,.fa-link:before,.icon-link:before{content:""}.fa-cloud:before{content:""}.fa-flask:before{content:""}.fa-cut:before,.fa-scissors:before{content:""}.fa-copy:before,.fa-files-o:before{content:""}.fa-paperclip:before{content:""}.fa-floppy-o:before,.fa-save:before{content:""}.fa-square:before{content:""}.fa-bars:before,.fa-navicon:before,.fa-reorder:before{content:""}.fa-list-ul:before{content:""}.fa-list-ol:before{content:""}.fa-strikethrough:before{content:""}.fa-underline:before{content:""}.fa-table:before{content:""}.fa-magic:before{content:""}.fa-truck:before{content:""}.fa-pinterest:before{content:""}.fa-pinterest-square:before{content:""}.fa-google-plus-square:before{content:""}.fa-google-plus:before{content:""}.fa-money:before{content:""}.fa-caret-down:before,.icon-caret-down:before,.wy-dropdown 
.caret:before{content:""}.fa-caret-up:before{content:""}.fa-caret-left:before{content:""}.fa-caret-right:before{content:""}.fa-columns:before{content:""}.fa-sort:before,.fa-unsorted:before{content:""}.fa-sort-desc:before,.fa-sort-down:before{content:""}.fa-sort-asc:before,.fa-sort-up:before{content:""}.fa-envelope:before{content:""}.fa-linkedin:before{content:""}.fa-rotate-left:before,.fa-undo:before{content:""}.fa-gavel:before,.fa-legal:before{content:""}.fa-dashboard:before,.fa-tachometer:before{content:""}.fa-comment-o:before{content:""}.fa-comments-o:before{content:""}.fa-bolt:before,.fa-flash:before{content:""}.fa-sitemap:before{content:""}.fa-umbrella:before{content:""}.fa-clipboard:before,.fa-paste:before{content:""}.fa-lightbulb-o:before{content:""}.fa-exchange:before{content:""}.fa-cloud-download:before{content:""}.fa-cloud-upload:before{content:""}.fa-user-md:before{content:""}.fa-stethoscope:before{content:""}.fa-suitcase:before{content:""}.fa-bell-o:before{content:""}.fa-coffee:before{content:""}.fa-cutlery:before{content:""}.fa-file-text-o:before{content:""}.fa-building-o:before{content:""}.fa-hospital-o:before{content:""}.fa-ambulance:before{content:""}.fa-medkit:before{content:""}.fa-fighter-jet:before{content:""}.fa-beer:before{content:""}.fa-h-square:before{content:""}.fa-plus-square:before{content:""}.fa-angle-double-left:before{content:""}.fa-angle-double-right:before{content:""}.fa-angle-double-up:before{content:""}.fa-angle-double-down:before{content:""}.fa-angle-left:before{content:""}.fa-angle-right:before{content:""}.fa-angle-up:before{content:""}.fa-angle-down:before{content:""}.fa-desktop:before{content:""}.fa-laptop:before{content:""}.fa-tablet:before{content:""}.fa-mobile-phone:before,.fa-mobile:before{content:""}.fa-circle-o:before{content:""}.fa-quote-left:before{content:""}.fa-quote-right:before{content:""}.fa-spinner:before{content:""}.fa-circle:before{content:""}.fa-mail-reply:before,.fa-reply:before{content:""}.fa-github-alt:before{content:""}.fa-folder-o:before{content:""}.fa-folder-open-o:before{content:""}.fa-smile-o:before{content:""}.fa-frown-o:before{content:""}.fa-meh-o:before{content:""}.fa-gamepad:before{content:""}.fa-keyboard-o:before{content:""}.fa-flag-o:before{content:""}.fa-flag-checkered:before{content:""}.fa-terminal:before{content:""}.fa-code:before{content:""}.fa-mail-reply-all:before,.fa-reply-all:before{content:""}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:""}.fa-location-arrow:before{content:""}.fa-crop:before{content:""}.fa-code-fork:before{content:""}.fa-chain-broken:before,.fa-unlink:before{content:""}.fa-question:before{content:""}.fa-info:before{content:""}.fa-exclamation:before{content:""}.fa-superscript:before{content:""}.fa-subscript:before{content:""}.fa-eraser:before{content:""}.fa-puzzle-piece:before{content:""}.fa-microphone:before{content:""}.fa-microphone-slash:before{content:""}.fa-shield:before{content:""}.fa-calendar-o:before{content:""}.fa-fire-extinguisher:before{content:""}.fa-rocket:before{content:""}.fa-maxcdn:before{content:""}.fa-chevron-circle-left:before{content:""}.fa-chevron-circle-right:before{content:""}.fa-chevron-circle-up:before{content:""}.fa-chevron-circle-down:before{content:""}.fa-html5:before{content:""}.fa-css3:before{content:""}.fa-anchor:before{content:""}.fa-unlock-alt:before{content:""}.fa-bullseye:before{content:""}.fa-ellipsis-h:before{content:""}.fa-elli
psis-v:before{content:""}.fa-rss-square:before{content:""}.fa-play-circle:before{content:""}.fa-ticket:before{content:""}.fa-minus-square:before{content:""}.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before{content:""}.fa-level-up:before{content:""}.fa-level-down:before{content:""}.fa-check-square:before{content:""}.fa-pencil-square:before{content:""}.fa-external-link-square:before{content:""}.fa-share-square:before{content:""}.fa-compass:before{content:""}.fa-caret-square-o-down:before,.fa-toggle-down:before{content:""}.fa-caret-square-o-up:before,.fa-toggle-up:before{content:""}.fa-caret-square-o-right:before,.fa-toggle-right:before{content:""}.fa-eur:before,.fa-euro:before{content:""}.fa-gbp:before{content:""}.fa-dollar:before,.fa-usd:before{content:""}.fa-inr:before,.fa-rupee:before{content:""}.fa-cny:before,.fa-jpy:before,.fa-rmb:before,.fa-yen:before{content:""}.fa-rouble:before,.fa-rub:before,.fa-ruble:before{content:""}.fa-krw:before,.fa-won:before{content:""}.fa-bitcoin:before,.fa-btc:before{content:""}.fa-file:before{content:""}.fa-file-text:before{content:""}.fa-sort-alpha-asc:before{content:""}.fa-sort-alpha-desc:before{content:""}.fa-sort-amount-asc:before{content:""}.fa-sort-amount-desc:before{content:""}.fa-sort-numeric-asc:before{content:""}.fa-sort-numeric-desc:before{content:""}.fa-thumbs-up:before{content:""}.fa-thumbs-down:before{content:""}.fa-youtube-square:before{content:""}.fa-youtube:before{content:""}.fa-xing:before{content:""}.fa-xing-square:before{content:""}.fa-youtube-play:before{content:""}.fa-dropbox:before{content:""}.fa-stack-overflow:before{content:""}.fa-instagram:before{content:""}.fa-flickr:before{content:""}.fa-adn:before{content:""}.fa-bitbucket:before,.icon-bitbucket:before{content:""}.fa-bitbucket-square:before{content:""}.fa-tumblr:before{content:""}.fa-tumblr-square:before{content:""}.fa-long-arrow-down:before{content:""}.fa-long-arrow-up:before{content:""}.fa-long-arrow-left:before{content:""}.fa-long-arrow-right:before{content:""}.fa-apple:before{content:""}.fa-windows:before{content:""}.fa-android:before{content:""}.fa-linux:before{content:""}.fa-dribbble:before{content:""}.fa-skype:before{content:""}.fa-foursquare:before{content:""}.fa-trello:before{content:""}.fa-female:before{content:""}.fa-male:before{content:""}.fa-gittip:before,.fa-gratipay:before{content:""}.fa-sun-o:before{content:""}.fa-moon-o:before{content:""}.fa-archive:before{content:""}.fa-bug:before{content:""}.fa-vk:before{content:""}.fa-weibo:before{content:""}.fa-renren:before{content:""}.fa-pagelines:before{content:""}.fa-stack-exchange:before{content:""}.fa-arrow-circle-o-right:before{content:""}.fa-arrow-circle-o-left:before{content:""}.fa-caret-square-o-left:before,.fa-toggle-left:before{content:""}.fa-dot-circle-o:before{content:""}.fa-wheelchair:before{content:""}.fa-vimeo-square:before{content:""}.fa-try:before,.fa-turkish-lira:before{content:""}.fa-plus-square-o:before,.wy-menu-vertical li 
button.toctree-expand:before{content:""}.fa-space-shuttle:before{content:""}.fa-slack:before{content:""}.fa-envelope-square:before{content:""}.fa-wordpress:before{content:""}.fa-openid:before{content:""}.fa-bank:before,.fa-institution:before,.fa-university:before{content:""}.fa-graduation-cap:before,.fa-mortar-board:before{content:""}.fa-yahoo:before{content:""}.fa-google:before{content:""}.fa-reddit:before{content:""}.fa-reddit-square:before{content:""}.fa-stumbleupon-circle:before{content:""}.fa-stumbleupon:before{content:""}.fa-delicious:before{content:""}.fa-digg:before{content:""}.fa-pied-piper-pp:before{content:""}.fa-pied-piper-alt:before{content:""}.fa-drupal:before{content:""}.fa-joomla:before{content:""}.fa-language:before{content:""}.fa-fax:before{content:""}.fa-building:before{content:""}.fa-child:before{content:""}.fa-paw:before{content:""}.fa-spoon:before{content:""}.fa-cube:before{content:""}.fa-cubes:before{content:""}.fa-behance:before{content:""}.fa-behance-square:before{content:""}.fa-steam:before{content:""}.fa-steam-square:before{content:""}.fa-recycle:before{content:""}.fa-automobile:before,.fa-car:before{content:""}.fa-cab:before,.fa-taxi:before{content:""}.fa-tree:before{content:""}.fa-spotify:before{content:""}.fa-deviantart:before{content:""}.fa-soundcloud:before{content:""}.fa-database:before{content:""}.fa-file-pdf-o:before{content:""}.fa-file-word-o:before{content:""}.fa-file-excel-o:before{content:""}.fa-file-powerpoint-o:before{content:""}.fa-file-image-o:before,.fa-file-photo-o:before,.fa-file-picture-o:before{content:""}.fa-file-archive-o:before,.fa-file-zip-o:before{content:""}.fa-file-audio-o:before,.fa-file-sound-o:before{content:""}.fa-file-movie-o:before,.fa-file-video-o:before{content:""}.fa-file-code-o:before{content:""}.fa-vine:before{content:""}.fa-codepen:before{content:""}.fa-jsfiddle:before{content:""}.fa-life-bouy:before,.fa-life-buoy:before,.fa-life-ring:before,.fa-life-saver:before,.fa-support:before{content:""}.fa-circle-o-notch:before{content:""}.fa-ra:before,.fa-rebel:before,.fa-resistance:before{content:""}.fa-empire:before,.fa-ge:before{content:""}.fa-git-square:before{content:""}.fa-git:before{content:""}.fa-hacker-news:before,.fa-y-combinator-square:before,.fa-yc-square:before{content:""}.fa-tencent-weibo:before{content:""}.fa-qq:before{content:""}.fa-wechat:before,.fa-weixin:before{content:""}.fa-paper-plane:before,.fa-send:before{content:""}.fa-paper-plane-o:before,.fa-send-o:before{content:""}.fa-history:before{content:""}.fa-circle-thin:before{content:""}.fa-header:before{content:""}.fa-paragraph:before{content:""}.fa-sliders:before{content:""}.fa-share-alt:before{content:""}.fa-share-alt-square:before{content:""}.fa-bomb:before{content:""}.fa-futbol-o:before,.fa-soccer-ball-o:before{content:""}.fa-tty:before{content:""}.fa-binoculars:before{content:""}.fa-plug:before{content:""}.fa-slideshare:before{content:""}.fa-twitch:before{content:""}.fa-yelp:before{content:""}.fa-newspaper-o:before{content:""}.fa-wifi:before{content:""}.fa-calculator:before{content:""}.fa-paypal:before{content:""}.fa-google-wallet:before{content:""}.fa-cc-visa:before{content:""}.fa-cc-mastercard:before{content:""}.fa-cc-discover:before{content:""}.fa-cc-amex:before{content:""}.fa-cc-paypal:before{content:""}.fa-cc-stripe:before{content:""}.fa-bell-slash:before{content:""}.fa-bell-slash-o:before{content:""}.fa-trash:before{content:""}.fa-copyright:before{content:""}.f
a-at:before{content:""}.fa-eyedropper:before{content:""}.fa-paint-brush:before{content:""}.fa-birthday-cake:before{content:""}.fa-area-chart:before{content:""}.fa-pie-chart:before{content:""}.fa-line-chart:before{content:""}.fa-lastfm:before{content:""}.fa-lastfm-square:before{content:""}.fa-toggle-off:before{content:""}.fa-toggle-on:before{content:""}.fa-bicycle:before{content:""}.fa-bus:before{content:""}.fa-ioxhost:before{content:""}.fa-angellist:before{content:""}.fa-cc:before{content:""}.fa-ils:before,.fa-shekel:before,.fa-sheqel:before{content:""}.fa-meanpath:before{content:""}.fa-buysellads:before{content:""}.fa-connectdevelop:before{content:""}.fa-dashcube:before{content:""}.fa-forumbee:before{content:""}.fa-leanpub:before{content:""}.fa-sellsy:before{content:""}.fa-shirtsinbulk:before{content:""}.fa-simplybuilt:before{content:""}.fa-skyatlas:before{content:""}.fa-cart-plus:before{content:""}.fa-cart-arrow-down:before{content:""}.fa-diamond:before{content:""}.fa-ship:before{content:""}.fa-user-secret:before{content:""}.fa-motorcycle:before{content:""}.fa-street-view:before{content:""}.fa-heartbeat:before{content:""}.fa-venus:before{content:""}.fa-mars:before{content:""}.fa-mercury:before{content:""}.fa-intersex:before,.fa-transgender:before{content:""}.fa-transgender-alt:before{content:""}.fa-venus-double:before{content:""}.fa-mars-double:before{content:""}.fa-venus-mars:before{content:""}.fa-mars-stroke:before{content:""}.fa-mars-stroke-v:before{content:""}.fa-mars-stroke-h:before{content:""}.fa-neuter:before{content:""}.fa-genderless:before{content:""}.fa-facebook-official:before{content:""}.fa-pinterest-p:before{content:""}.fa-whatsapp:before{content:""}.fa-server:before{content:""}.fa-user-plus:before{content:""}.fa-user-times:before{content:""}.fa-bed:before,.fa-hotel:before{content:""}.fa-viacoin:before{content:""}.fa-train:before{content:""}.fa-subway:before{content:""}.fa-medium:before{content:""}.fa-y-combinator:before,.fa-yc:before{content:""}.fa-optin-monster:before{content:""}.fa-opencart:before{content:""}.fa-expeditedssl:before{content:""}.fa-battery-4:before,.fa-battery-full:before,.fa-battery:before{content:""}.fa-battery-3:before,.fa-battery-three-quarters:before{content:""}.fa-battery-2:before,.fa-battery-half:before{content:""}.fa-battery-1:before,.fa-battery-quarter:before{content:""}.fa-battery-0:before,.fa-battery-empty:before{content:""}.fa-mouse-pointer:before{content:""}.fa-i-cursor:before{content:""}.fa-object-group:before{content:""}.fa-object-ungroup:before{content:""}.fa-sticky-note:before{content:""}.fa-sticky-note-o:before{content:""}.fa-cc-jcb:before{content:""}.fa-cc-diners-club:before{content:""}.fa-clone:before{content:""}.fa-balance-scale:before{content:""}.fa-hourglass-o:before{content:""}.fa-hourglass-1:before,.fa-hourglass-start:before{content:""}.fa-hourglass-2:before,.fa-hourglass-half:before{content:""}.fa-hourglass-3:before,.fa-hourglass-end:before{content:""}.fa-hourglass:before{content:""}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:""}.fa-hand-paper-o:before,.fa-hand-stop-o:before{content:""}.fa-hand-scissors-o:before{content:""}.fa-hand-lizard-o:before{content:""}.fa-hand-spock-o:before{content:""}.fa-hand-pointer-o:before{content:""}.fa-hand-peace-o:before{content:""}.fa-trademark:before{content:""}.fa-registered:before{content:""}.fa-creative-commons:before{content:""}.fa-gg:before{content:""}.fa-gg-circle:before{content:""}.fa-trip
advisor:before{content:""}.fa-odnoklassniki:before{content:""}.fa-odnoklassniki-square:before{content:""}.fa-get-pocket:before{content:""}.fa-wikipedia-w:before{content:""}.fa-safari:before{content:""}.fa-chrome:before{content:""}.fa-firefox:before{content:""}.fa-opera:before{content:""}.fa-internet-explorer:before{content:""}.fa-television:before,.fa-tv:before{content:""}.fa-contao:before{content:""}.fa-500px:before{content:""}.fa-amazon:before{content:""}.fa-calendar-plus-o:before{content:""}.fa-calendar-minus-o:before{content:""}.fa-calendar-times-o:before{content:""}.fa-calendar-check-o:before{content:""}.fa-industry:before{content:""}.fa-map-pin:before{content:""}.fa-map-signs:before{content:""}.fa-map-o:before{content:""}.fa-map:before{content:""}.fa-commenting:before{content:""}.fa-commenting-o:before{content:""}.fa-houzz:before{content:""}.fa-vimeo:before{content:""}.fa-black-tie:before{content:""}.fa-fonticons:before{content:""}.fa-reddit-alien:before{content:""}.fa-edge:before{content:""}.fa-credit-card-alt:before{content:""}.fa-codiepie:before{content:""}.fa-modx:before{content:""}.fa-fort-awesome:before{content:""}.fa-usb:before{content:""}.fa-product-hunt:before{content:""}.fa-mixcloud:before{content:""}.fa-scribd:before{content:""}.fa-pause-circle:before{content:""}.fa-pause-circle-o:before{content:""}.fa-stop-circle:before{content:""}.fa-stop-circle-o:before{content:""}.fa-shopping-bag:before{content:""}.fa-shopping-basket:before{content:""}.fa-hashtag:before{content:""}.fa-bluetooth:before{content:""}.fa-bluetooth-b:before{content:""}.fa-percent:before{content:""}.fa-gitlab:before,.icon-gitlab:before{content:""}.fa-wpbeginner:before{content:""}.fa-wpforms:before{content:""}.fa-envira:before{content:""}.fa-universal-access:before{content:""}.fa-wheelchair-alt:before{content:""}.fa-question-circle-o:before{content:""}.fa-blind:before{content:""}.fa-audio-description:before{content:""}.fa-volume-control-phone:before{content:""}.fa-braille:before{content:""}.fa-assistive-listening-systems:before{content:""}.fa-american-sign-language-interpreting:before,.fa-asl-interpreting:before{content:""}.fa-deaf:before,.fa-deafness:before,.fa-hard-of-hearing:before{content:""}.fa-glide:before{content:""}.fa-glide-g:before{content:""}.fa-sign-language:before,.fa-signing:before{content:""}.fa-low-vision:before{content:""}.fa-viadeo:before{content:""}.fa-viadeo-square:before{content:""}.fa-snapchat:before{content:""}.fa-snapchat-ghost:before{content:""}.fa-snapchat-square:before{content:""}.fa-pied-piper:before{content:""}.fa-first-order:before{content:""}.fa-yoast:before{content:""}.fa-themeisle:before{content:""}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:""}.fa-fa:before,.fa-font-awesome:before{content:""}.fa-handshake-o:before{content:""}.fa-envelope-open:before{content:""}.fa-envelope-open-o:before{content:""}.fa-linode:before{content:""}.fa-address-book:before{content:""}.fa-address-book-o:before{content:""}.fa-address-card:before,.fa-vcard:before{content:""}.fa-address-card-o:before,.fa-vcard-o:before{content:""}.fa-user-circle:before{content:""}.fa-user-circle-o:before{content:""}.fa-user-o:before{content:""}.fa-id-badge:before{content:""}.fa-drivers-license:before,.fa-id-card:before{content:""}.fa-drivers-license-o:before,.fa-id-card-o:before{content:""}.fa-quora:before{content:""}.fa-free-code-camp:before{content:""}.fa-telegram:before{content:""}.fa-thermometer-4:b
efore,.fa-thermometer-full:before,.fa-thermometer:before{content:""}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:""}.fa-thermometer-2:before,.fa-thermometer-half:before{content:""}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:""}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:""}.fa-shower:before{content:""}.fa-bath:before,.fa-bathtub:before,.fa-s15:before{content:""}.fa-podcast:before{content:""}.fa-window-maximize:before{content:""}.fa-window-minimize:before{content:""}.fa-window-restore:before{content:""}.fa-times-rectangle:before,.fa-window-close:before{content:""}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:""}.fa-bandcamp:before{content:""}.fa-grav:before{content:""}.fa-etsy:before{content:""}.fa-imdb:before{content:""}.fa-ravelry:before{content:""}.fa-eercast:before{content:""}.fa-microchip:before{content:""}.fa-snowflake-o:before{content:""}.fa-superpowers:before{content:""}.fa-wpexplorer:before{content:""}.fa-meetup:before{content:""}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-dropdown .caret,.wy-inline-validate.wy-inline-validate-danger .wy-input-context,.wy-inline-validate.wy-inline-validate-info .wy-input-context,.wy-inline-validate.wy-inline-validate-success .wy-input-context,.wy-inline-validate.wy-inline-validate-warning .wy-input-context,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{font-family:inherit}.fa:before,.icon:before,.rst-content .admonition-title:before,.rst-content .code-block-caption .headerlink:before,.rst-content .eqno .headerlink:before,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before{font-family:FontAwesome;display:inline-block;font-style:normal;font-weight:400;line-height:1;text-decoration:inherit}.rst-content .code-block-caption a .headerlink,.rst-content .eqno a .headerlink,.rst-content a 
.admonition-title,.rst-content code.download a span:first-child,.rst-content dl dt a .headerlink,.rst-content h1 a .headerlink,.rst-content h2 a .headerlink,.rst-content h3 a .headerlink,.rst-content h4 a .headerlink,.rst-content h5 a .headerlink,.rst-content h6 a .headerlink,.rst-content p.caption a .headerlink,.rst-content p a .headerlink,.rst-content table>caption a .headerlink,.rst-content tt.download a span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li a button.toctree-expand,a .fa,a .icon,a .rst-content .admonition-title,a .rst-content .code-block-caption .headerlink,a .rst-content .eqno .headerlink,a .rst-content code.download span:first-child,a .rst-content dl dt .headerlink,a .rst-content h1 .headerlink,a .rst-content h2 .headerlink,a .rst-content h3 .headerlink,a .rst-content h4 .headerlink,a .rst-content h5 .headerlink,a .rst-content h6 .headerlink,a .rst-content p.caption .headerlink,a .rst-content p .headerlink,a .rst-content table>caption .headerlink,a .rst-content tt.download span:first-child,a .wy-menu-vertical li button.toctree-expand{display:inline-block;text-decoration:inherit}.btn .fa,.btn .icon,.btn .rst-content .admonition-title,.btn .rst-content .code-block-caption .headerlink,.btn .rst-content .eqno .headerlink,.btn .rst-content code.download span:first-child,.btn .rst-content dl dt .headerlink,.btn .rst-content h1 .headerlink,.btn .rst-content h2 .headerlink,.btn .rst-content h3 .headerlink,.btn .rst-content h4 .headerlink,.btn .rst-content h5 .headerlink,.btn .rst-content h6 .headerlink,.btn .rst-content p .headerlink,.btn .rst-content table>caption .headerlink,.btn .rst-content tt.download span:first-child,.btn .wy-menu-vertical li.current>a button.toctree-expand,.btn .wy-menu-vertical li.on a button.toctree-expand,.btn .wy-menu-vertical li button.toctree-expand,.nav .fa,.nav .icon,.nav .rst-content .admonition-title,.nav .rst-content .code-block-caption .headerlink,.nav .rst-content .eqno .headerlink,.nav .rst-content code.download span:first-child,.nav .rst-content dl dt .headerlink,.nav .rst-content h1 .headerlink,.nav .rst-content h2 .headerlink,.nav .rst-content h3 .headerlink,.nav .rst-content h4 .headerlink,.nav .rst-content h5 .headerlink,.nav .rst-content h6 .headerlink,.nav .rst-content p .headerlink,.nav .rst-content table>caption .headerlink,.nav .rst-content tt.download span:first-child,.nav .wy-menu-vertical li.current>a button.toctree-expand,.nav .wy-menu-vertical li.on a button.toctree-expand,.nav .wy-menu-vertical li button.toctree-expand,.rst-content .btn .admonition-title,.rst-content .code-block-caption .btn .headerlink,.rst-content .code-block-caption .nav .headerlink,.rst-content .eqno .btn .headerlink,.rst-content .eqno .nav .headerlink,.rst-content .nav .admonition-title,.rst-content code.download .btn span:first-child,.rst-content code.download .nav span:first-child,.rst-content dl dt .btn .headerlink,.rst-content dl dt .nav .headerlink,.rst-content h1 .btn .headerlink,.rst-content h1 .nav .headerlink,.rst-content h2 .btn .headerlink,.rst-content h2 .nav .headerlink,.rst-content h3 .btn .headerlink,.rst-content h3 .nav .headerlink,.rst-content h4 .btn .headerlink,.rst-content h4 .nav .headerlink,.rst-content h5 .btn .headerlink,.rst-content h5 .nav .headerlink,.rst-content h6 .btn .headerlink,.rst-content h6 .nav .headerlink,.rst-content p .btn .headerlink,.rst-content p .nav .headerlink,.rst-content table>caption .btn .headerlink,.rst-content 
table>caption .nav .headerlink,.rst-content tt.download .btn span:first-child,.rst-content tt.download .nav span:first-child,.wy-menu-vertical li .btn button.toctree-expand,.wy-menu-vertical li.current>a .btn button.toctree-expand,.wy-menu-vertical li.current>a .nav button.toctree-expand,.wy-menu-vertical li .nav button.toctree-expand,.wy-menu-vertical li.on a .btn button.toctree-expand,.wy-menu-vertical li.on a .nav button.toctree-expand{display:inline}.btn .fa-large.icon,.btn .fa.fa-large,.btn .rst-content .code-block-caption .fa-large.headerlink,.btn .rst-content .eqno .fa-large.headerlink,.btn .rst-content .fa-large.admonition-title,.btn .rst-content code.download span.fa-large:first-child,.btn .rst-content dl dt .fa-large.headerlink,.btn .rst-content h1 .fa-large.headerlink,.btn .rst-content h2 .fa-large.headerlink,.btn .rst-content h3 .fa-large.headerlink,.btn .rst-content h4 .fa-large.headerlink,.btn .rst-content h5 .fa-large.headerlink,.btn .rst-content h6 .fa-large.headerlink,.btn .rst-content p .fa-large.headerlink,.btn .rst-content table>caption .fa-large.headerlink,.btn .rst-content tt.download span.fa-large:first-child,.btn .wy-menu-vertical li button.fa-large.toctree-expand,.nav .fa-large.icon,.nav .fa.fa-large,.nav .rst-content .code-block-caption .fa-large.headerlink,.nav .rst-content .eqno .fa-large.headerlink,.nav .rst-content .fa-large.admonition-title,.nav .rst-content code.download span.fa-large:first-child,.nav .rst-content dl dt .fa-large.headerlink,.nav .rst-content h1 .fa-large.headerlink,.nav .rst-content h2 .fa-large.headerlink,.nav .rst-content h3 .fa-large.headerlink,.nav .rst-content h4 .fa-large.headerlink,.nav .rst-content h5 .fa-large.headerlink,.nav .rst-content h6 .fa-large.headerlink,.nav .rst-content p .fa-large.headerlink,.nav .rst-content table>caption .fa-large.headerlink,.nav .rst-content tt.download span.fa-large:first-child,.nav .wy-menu-vertical li button.fa-large.toctree-expand,.rst-content .btn .fa-large.admonition-title,.rst-content .code-block-caption .btn .fa-large.headerlink,.rst-content .code-block-caption .nav .fa-large.headerlink,.rst-content .eqno .btn .fa-large.headerlink,.rst-content .eqno .nav .fa-large.headerlink,.rst-content .nav .fa-large.admonition-title,.rst-content code.download .btn span.fa-large:first-child,.rst-content code.download .nav span.fa-large:first-child,.rst-content dl dt .btn .fa-large.headerlink,.rst-content dl dt .nav .fa-large.headerlink,.rst-content h1 .btn .fa-large.headerlink,.rst-content h1 .nav .fa-large.headerlink,.rst-content h2 .btn .fa-large.headerlink,.rst-content h2 .nav .fa-large.headerlink,.rst-content h3 .btn .fa-large.headerlink,.rst-content h3 .nav .fa-large.headerlink,.rst-content h4 .btn .fa-large.headerlink,.rst-content h4 .nav .fa-large.headerlink,.rst-content h5 .btn .fa-large.headerlink,.rst-content h5 .nav .fa-large.headerlink,.rst-content h6 .btn .fa-large.headerlink,.rst-content h6 .nav .fa-large.headerlink,.rst-content p .btn .fa-large.headerlink,.rst-content p .nav .fa-large.headerlink,.rst-content table>caption .btn .fa-large.headerlink,.rst-content table>caption .nav .fa-large.headerlink,.rst-content tt.download .btn span.fa-large:first-child,.rst-content tt.download .nav span.fa-large:first-child,.wy-menu-vertical li .btn button.fa-large.toctree-expand,.wy-menu-vertical li .nav button.fa-large.toctree-expand{line-height:.9em}.btn .fa-spin.icon,.btn .fa.fa-spin,.btn .rst-content .code-block-caption .fa-spin.headerlink,.btn .rst-content .eqno .fa-spin.headerlink,.btn .rst-content 
.fa-spin.admonition-title,.btn .rst-content code.download span.fa-spin:first-child,.btn .rst-content dl dt .fa-spin.headerlink,.btn .rst-content h1 .fa-spin.headerlink,.btn .rst-content h2 .fa-spin.headerlink,.btn .rst-content h3 .fa-spin.headerlink,.btn .rst-content h4 .fa-spin.headerlink,.btn .rst-content h5 .fa-spin.headerlink,.btn .rst-content h6 .fa-spin.headerlink,.btn .rst-content p .fa-spin.headerlink,.btn .rst-content table>caption .fa-spin.headerlink,.btn .rst-content tt.download span.fa-spin:first-child,.btn .wy-menu-vertical li button.fa-spin.toctree-expand,.nav .fa-spin.icon,.nav .fa.fa-spin,.nav .rst-content .code-block-caption .fa-spin.headerlink,.nav .rst-content .eqno .fa-spin.headerlink,.nav .rst-content .fa-spin.admonition-title,.nav .rst-content code.download span.fa-spin:first-child,.nav .rst-content dl dt .fa-spin.headerlink,.nav .rst-content h1 .fa-spin.headerlink,.nav .rst-content h2 .fa-spin.headerlink,.nav .rst-content h3 .fa-spin.headerlink,.nav .rst-content h4 .fa-spin.headerlink,.nav .rst-content h5 .fa-spin.headerlink,.nav .rst-content h6 .fa-spin.headerlink,.nav .rst-content p .fa-spin.headerlink,.nav .rst-content table>caption .fa-spin.headerlink,.nav .rst-content tt.download span.fa-spin:first-child,.nav .wy-menu-vertical li button.fa-spin.toctree-expand,.rst-content .btn .fa-spin.admonition-title,.rst-content .code-block-caption .btn .fa-spin.headerlink,.rst-content .code-block-caption .nav .fa-spin.headerlink,.rst-content .eqno .btn .fa-spin.headerlink,.rst-content .eqno .nav .fa-spin.headerlink,.rst-content .nav .fa-spin.admonition-title,.rst-content code.download .btn span.fa-spin:first-child,.rst-content code.download .nav span.fa-spin:first-child,.rst-content dl dt .btn .fa-spin.headerlink,.rst-content dl dt .nav .fa-spin.headerlink,.rst-content h1 .btn .fa-spin.headerlink,.rst-content h1 .nav .fa-spin.headerlink,.rst-content h2 .btn .fa-spin.headerlink,.rst-content h2 .nav .fa-spin.headerlink,.rst-content h3 .btn .fa-spin.headerlink,.rst-content h3 .nav .fa-spin.headerlink,.rst-content h4 .btn .fa-spin.headerlink,.rst-content h4 .nav .fa-spin.headerlink,.rst-content h5 .btn .fa-spin.headerlink,.rst-content h5 .nav .fa-spin.headerlink,.rst-content h6 .btn .fa-spin.headerlink,.rst-content h6 .nav .fa-spin.headerlink,.rst-content p .btn .fa-spin.headerlink,.rst-content p .nav .fa-spin.headerlink,.rst-content table>caption .btn .fa-spin.headerlink,.rst-content table>caption .nav .fa-spin.headerlink,.rst-content tt.download .btn span.fa-spin:first-child,.rst-content tt.download .nav span.fa-spin:first-child,.wy-menu-vertical li .btn button.fa-spin.toctree-expand,.wy-menu-vertical li .nav button.fa-spin.toctree-expand{display:inline-block}.btn.fa:before,.btn.icon:before,.rst-content .btn.admonition-title:before,.rst-content .code-block-caption .btn.headerlink:before,.rst-content .eqno .btn.headerlink:before,.rst-content code.download span.btn:first-child:before,.rst-content dl dt .btn.headerlink:before,.rst-content h1 .btn.headerlink:before,.rst-content h2 .btn.headerlink:before,.rst-content h3 .btn.headerlink:before,.rst-content h4 .btn.headerlink:before,.rst-content h5 .btn.headerlink:before,.rst-content h6 .btn.headerlink:before,.rst-content p .btn.headerlink:before,.rst-content table>caption .btn.headerlink:before,.rst-content tt.download span.btn:first-child:before,.wy-menu-vertical li button.btn.toctree-expand:before{opacity:.5;-webkit-transition:opacity .05s ease-in;-moz-transition:opacity .05s ease-in;transition:opacity .05s 
ease-in}.btn.fa:hover:before,.btn.icon:hover:before,.rst-content .btn.admonition-title:hover:before,.rst-content .code-block-caption .btn.headerlink:hover:before,.rst-content .eqno .btn.headerlink:hover:before,.rst-content code.download span.btn:first-child:hover:before,.rst-content dl dt .btn.headerlink:hover:before,.rst-content h1 .btn.headerlink:hover:before,.rst-content h2 .btn.headerlink:hover:before,.rst-content h3 .btn.headerlink:hover:before,.rst-content h4 .btn.headerlink:hover:before,.rst-content h5 .btn.headerlink:hover:before,.rst-content h6 .btn.headerlink:hover:before,.rst-content p .btn.headerlink:hover:before,.rst-content table>caption .btn.headerlink:hover:before,.rst-content tt.download span.btn:first-child:hover:before,.wy-menu-vertical li button.btn.toctree-expand:hover:before{opacity:1}.btn-mini .fa:before,.btn-mini .icon:before,.btn-mini .rst-content .admonition-title:before,.btn-mini .rst-content .code-block-caption .headerlink:before,.btn-mini .rst-content .eqno .headerlink:before,.btn-mini .rst-content code.download span:first-child:before,.btn-mini .rst-content dl dt .headerlink:before,.btn-mini .rst-content h1 .headerlink:before,.btn-mini .rst-content h2 .headerlink:before,.btn-mini .rst-content h3 .headerlink:before,.btn-mini .rst-content h4 .headerlink:before,.btn-mini .rst-content h5 .headerlink:before,.btn-mini .rst-content h6 .headerlink:before,.btn-mini .rst-content p .headerlink:before,.btn-mini .rst-content table>caption .headerlink:before,.btn-mini .rst-content tt.download span:first-child:before,.btn-mini .wy-menu-vertical li button.toctree-expand:before,.rst-content .btn-mini .admonition-title:before,.rst-content .code-block-caption .btn-mini .headerlink:before,.rst-content .eqno .btn-mini .headerlink:before,.rst-content code.download .btn-mini span:first-child:before,.rst-content dl dt .btn-mini .headerlink:before,.rst-content h1 .btn-mini .headerlink:before,.rst-content h2 .btn-mini .headerlink:before,.rst-content h3 .btn-mini .headerlink:before,.rst-content h4 .btn-mini .headerlink:before,.rst-content h5 .btn-mini .headerlink:before,.rst-content h6 .btn-mini .headerlink:before,.rst-content p .btn-mini .headerlink:before,.rst-content table>caption .btn-mini .headerlink:before,.rst-content tt.download .btn-mini span:first-child:before,.wy-menu-vertical li .btn-mini button.toctree-expand:before{font-size:14px;vertical-align:-15%}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.wy-alert{padding:12px;line-height:24px;margin-bottom:24px;background:#e7f2fa}.rst-content .admonition-title,.wy-alert-title{font-weight:700;display:block;color:#fff;background:#6ab0de;padding:6px 12px;margin:-12px -12px 12px}.rst-content .danger,.rst-content .error,.rst-content .wy-alert-danger.admonition,.rst-content .wy-alert-danger.admonition-todo,.rst-content .wy-alert-danger.attention,.rst-content .wy-alert-danger.caution,.rst-content .wy-alert-danger.hint,.rst-content .wy-alert-danger.important,.rst-content .wy-alert-danger.note,.rst-content .wy-alert-danger.seealso,.rst-content .wy-alert-danger.tip,.rst-content .wy-alert-danger.warning,.wy-alert.wy-alert-danger{background:#fdf3f2}.rst-content .danger .admonition-title,.rst-content .danger .wy-alert-title,.rst-content .error .admonition-title,.rst-content .error .wy-alert-title,.rst-content 
.wy-alert-danger.admonition-todo .admonition-title,.rst-content .wy-alert-danger.admonition-todo .wy-alert-title,.rst-content .wy-alert-danger.admonition .admonition-title,.rst-content .wy-alert-danger.admonition .wy-alert-title,.rst-content .wy-alert-danger.attention .admonition-title,.rst-content .wy-alert-danger.attention .wy-alert-title,.rst-content .wy-alert-danger.caution .admonition-title,.rst-content .wy-alert-danger.caution .wy-alert-title,.rst-content .wy-alert-danger.hint .admonition-title,.rst-content .wy-alert-danger.hint .wy-alert-title,.rst-content .wy-alert-danger.important .admonition-title,.rst-content .wy-alert-danger.important .wy-alert-title,.rst-content .wy-alert-danger.note .admonition-title,.rst-content .wy-alert-danger.note .wy-alert-title,.rst-content .wy-alert-danger.seealso .admonition-title,.rst-content .wy-alert-danger.seealso .wy-alert-title,.rst-content .wy-alert-danger.tip .admonition-title,.rst-content .wy-alert-danger.tip .wy-alert-title,.rst-content .wy-alert-danger.warning .admonition-title,.rst-content .wy-alert-danger.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-danger .admonition-title,.wy-alert.wy-alert-danger .rst-content .admonition-title,.wy-alert.wy-alert-danger .wy-alert-title{background:#f29f97}.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .warning,.rst-content .wy-alert-warning.admonition,.rst-content .wy-alert-warning.danger,.rst-content .wy-alert-warning.error,.rst-content .wy-alert-warning.hint,.rst-content .wy-alert-warning.important,.rst-content .wy-alert-warning.note,.rst-content .wy-alert-warning.seealso,.rst-content .wy-alert-warning.tip,.wy-alert.wy-alert-warning{background:#ffedcc}.rst-content .admonition-todo .admonition-title,.rst-content .admonition-todo .wy-alert-title,.rst-content .attention .admonition-title,.rst-content .attention .wy-alert-title,.rst-content .caution .admonition-title,.rst-content .caution .wy-alert-title,.rst-content .warning .admonition-title,.rst-content .warning .wy-alert-title,.rst-content .wy-alert-warning.admonition .admonition-title,.rst-content .wy-alert-warning.admonition .wy-alert-title,.rst-content .wy-alert-warning.danger .admonition-title,.rst-content .wy-alert-warning.danger .wy-alert-title,.rst-content .wy-alert-warning.error .admonition-title,.rst-content .wy-alert-warning.error .wy-alert-title,.rst-content .wy-alert-warning.hint .admonition-title,.rst-content .wy-alert-warning.hint .wy-alert-title,.rst-content .wy-alert-warning.important .admonition-title,.rst-content .wy-alert-warning.important .wy-alert-title,.rst-content .wy-alert-warning.note .admonition-title,.rst-content .wy-alert-warning.note .wy-alert-title,.rst-content .wy-alert-warning.seealso .admonition-title,.rst-content .wy-alert-warning.seealso .wy-alert-title,.rst-content .wy-alert-warning.tip .admonition-title,.rst-content .wy-alert-warning.tip .wy-alert-title,.rst-content .wy-alert.wy-alert-warning .admonition-title,.wy-alert.wy-alert-warning .rst-content .admonition-title,.wy-alert.wy-alert-warning .wy-alert-title{background:#f0b37e}.rst-content .note,.rst-content .seealso,.rst-content .wy-alert-info.admonition,.rst-content .wy-alert-info.admonition-todo,.rst-content .wy-alert-info.attention,.rst-content .wy-alert-info.caution,.rst-content .wy-alert-info.danger,.rst-content .wy-alert-info.error,.rst-content .wy-alert-info.hint,.rst-content .wy-alert-info.important,.rst-content .wy-alert-info.tip,.rst-content 
.wy-alert-info.warning,.wy-alert.wy-alert-info{background:#e7f2fa}.rst-content .note .admonition-title,.rst-content .note .wy-alert-title,.rst-content .seealso .admonition-title,.rst-content .seealso .wy-alert-title,.rst-content .wy-alert-info.admonition-todo .admonition-title,.rst-content .wy-alert-info.admonition-todo .wy-alert-title,.rst-content .wy-alert-info.admonition .admonition-title,.rst-content .wy-alert-info.admonition .wy-alert-title,.rst-content .wy-alert-info.attention .admonition-title,.rst-content .wy-alert-info.attention .wy-alert-title,.rst-content .wy-alert-info.caution .admonition-title,.rst-content .wy-alert-info.caution .wy-alert-title,.rst-content .wy-alert-info.danger .admonition-title,.rst-content .wy-alert-info.danger .wy-alert-title,.rst-content .wy-alert-info.error .admonition-title,.rst-content .wy-alert-info.error .wy-alert-title,.rst-content .wy-alert-info.hint .admonition-title,.rst-content .wy-alert-info.hint .wy-alert-title,.rst-content .wy-alert-info.important .admonition-title,.rst-content .wy-alert-info.important .wy-alert-title,.rst-content .wy-alert-info.tip .admonition-title,.rst-content .wy-alert-info.tip .wy-alert-title,.rst-content .wy-alert-info.warning .admonition-title,.rst-content .wy-alert-info.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-info .admonition-title,.wy-alert.wy-alert-info .rst-content .admonition-title,.wy-alert.wy-alert-info .wy-alert-title{background:#6ab0de}.rst-content .hint,.rst-content .important,.rst-content .tip,.rst-content .wy-alert-success.admonition,.rst-content .wy-alert-success.admonition-todo,.rst-content .wy-alert-success.attention,.rst-content .wy-alert-success.caution,.rst-content .wy-alert-success.danger,.rst-content .wy-alert-success.error,.rst-content .wy-alert-success.note,.rst-content .wy-alert-success.seealso,.rst-content .wy-alert-success.warning,.wy-alert.wy-alert-success{background:#dbfaf4}.rst-content .hint .admonition-title,.rst-content .hint .wy-alert-title,.rst-content .important .admonition-title,.rst-content .important .wy-alert-title,.rst-content .tip .admonition-title,.rst-content .tip .wy-alert-title,.rst-content .wy-alert-success.admonition-todo .admonition-title,.rst-content .wy-alert-success.admonition-todo .wy-alert-title,.rst-content .wy-alert-success.admonition .admonition-title,.rst-content .wy-alert-success.admonition .wy-alert-title,.rst-content .wy-alert-success.attention .admonition-title,.rst-content .wy-alert-success.attention .wy-alert-title,.rst-content .wy-alert-success.caution .admonition-title,.rst-content .wy-alert-success.caution .wy-alert-title,.rst-content .wy-alert-success.danger .admonition-title,.rst-content .wy-alert-success.danger .wy-alert-title,.rst-content .wy-alert-success.error .admonition-title,.rst-content .wy-alert-success.error .wy-alert-title,.rst-content .wy-alert-success.note .admonition-title,.rst-content .wy-alert-success.note .wy-alert-title,.rst-content .wy-alert-success.seealso .admonition-title,.rst-content .wy-alert-success.seealso .wy-alert-title,.rst-content .wy-alert-success.warning .admonition-title,.rst-content .wy-alert-success.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-success .admonition-title,.wy-alert.wy-alert-success .rst-content .admonition-title,.wy-alert.wy-alert-success .wy-alert-title{background:#1abc9c}.rst-content .wy-alert-neutral.admonition,.rst-content .wy-alert-neutral.admonition-todo,.rst-content .wy-alert-neutral.attention,.rst-content .wy-alert-neutral.caution,.rst-content 
.wy-alert-neutral.danger,.rst-content .wy-alert-neutral.error,.rst-content .wy-alert-neutral.hint,.rst-content .wy-alert-neutral.important,.rst-content .wy-alert-neutral.note,.rst-content .wy-alert-neutral.seealso,.rst-content .wy-alert-neutral.tip,.rst-content .wy-alert-neutral.warning,.wy-alert.wy-alert-neutral{background:#f3f6f6}.rst-content .wy-alert-neutral.admonition-todo .admonition-title,.rst-content .wy-alert-neutral.admonition-todo .wy-alert-title,.rst-content .wy-alert-neutral.admonition .admonition-title,.rst-content .wy-alert-neutral.admonition .wy-alert-title,.rst-content .wy-alert-neutral.attention .admonition-title,.rst-content .wy-alert-neutral.attention .wy-alert-title,.rst-content .wy-alert-neutral.caution .admonition-title,.rst-content .wy-alert-neutral.caution .wy-alert-title,.rst-content .wy-alert-neutral.danger .admonition-title,.rst-content .wy-alert-neutral.danger .wy-alert-title,.rst-content .wy-alert-neutral.error .admonition-title,.rst-content .wy-alert-neutral.error .wy-alert-title,.rst-content .wy-alert-neutral.hint .admonition-title,.rst-content .wy-alert-neutral.hint .wy-alert-title,.rst-content .wy-alert-neutral.important .admonition-title,.rst-content .wy-alert-neutral.important .wy-alert-title,.rst-content .wy-alert-neutral.note .admonition-title,.rst-content .wy-alert-neutral.note .wy-alert-title,.rst-content .wy-alert-neutral.seealso .admonition-title,.rst-content .wy-alert-neutral.seealso .wy-alert-title,.rst-content .wy-alert-neutral.tip .admonition-title,.rst-content .wy-alert-neutral.tip .wy-alert-title,.rst-content .wy-alert-neutral.warning .admonition-title,.rst-content .wy-alert-neutral.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-neutral .admonition-title,.wy-alert.wy-alert-neutral .rst-content .admonition-title,.wy-alert.wy-alert-neutral .wy-alert-title{color:#404040;background:#e1e4e5}.rst-content .wy-alert-neutral.admonition-todo a,.rst-content .wy-alert-neutral.admonition a,.rst-content .wy-alert-neutral.attention a,.rst-content .wy-alert-neutral.caution a,.rst-content .wy-alert-neutral.danger a,.rst-content .wy-alert-neutral.error a,.rst-content .wy-alert-neutral.hint a,.rst-content .wy-alert-neutral.important a,.rst-content .wy-alert-neutral.note a,.rst-content .wy-alert-neutral.seealso a,.rst-content .wy-alert-neutral.tip a,.rst-content .wy-alert-neutral.warning a,.wy-alert.wy-alert-neutral a{color:#2980b9}.rst-content .admonition-todo p:last-child,.rst-content .admonition p:last-child,.rst-content .attention p:last-child,.rst-content .caution p:last-child,.rst-content .danger p:last-child,.rst-content .error p:last-child,.rst-content .hint p:last-child,.rst-content .important p:last-child,.rst-content .note p:last-child,.rst-content .seealso p:last-child,.rst-content .tip p:last-child,.rst-content .warning p:last-child,.wy-alert p:last-child{margin-bottom:0}.wy-tray-container{position:fixed;bottom:0;left:0;z-index:600}.wy-tray-container li{display:block;width:300px;background:transparent;color:#fff;text-align:center;box-shadow:0 5px 5px 0 rgba(0,0,0,.1);padding:0 24px;min-width:20%;opacity:0;height:0;line-height:56px;overflow:hidden;-webkit-transition:all .3s ease-in;-moz-transition:all .3s ease-in;transition:all .3s ease-in}.wy-tray-container li.wy-tray-item-success{background:#27ae60}.wy-tray-container li.wy-tray-item-info{background:#2980b9}.wy-tray-container li.wy-tray-item-warning{background:#e67e22}.wy-tray-container li.wy-tray-item-danger{background:#e74c3c}.wy-tray-container li.on{opacity:1;height:56px}@media screen 
and (max-width:768px){.wy-tray-container{bottom:auto;top:0;width:100%}.wy-tray-container li{width:100%}}button{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle;cursor:pointer;line-height:normal;-webkit-appearance:button;*overflow:visible}button::-moz-focus-inner,input::-moz-focus-inner{border:0;padding:0}button[disabled]{cursor:default}.btn{display:inline-block;border-radius:2px;line-height:normal;white-space:nowrap;text-align:center;cursor:pointer;font-size:100%;padding:6px 12px 8px;color:#fff;border:1px solid rgba(0,0,0,.1);background-color:#27ae60;text-decoration:none;font-weight:400;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;box-shadow:inset 0 1px 2px -1px hsla(0,0%,100%,.5),inset 0 -2px 0 0 rgba(0,0,0,.1);outline-none:false;vertical-align:middle;*display:inline;zoom:1;-webkit-user-drag:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none;-webkit-transition:all .1s linear;-moz-transition:all .1s linear;transition:all .1s linear}.btn-hover{background:#2e8ece;color:#fff}.btn:hover{background:#2cc36b;color:#fff}.btn:focus{background:#2cc36b;outline:0}.btn:active{box-shadow:inset 0 -1px 0 0 rgba(0,0,0,.05),inset 0 2px 0 0 rgba(0,0,0,.1);padding:8px 12px 6px}.btn:visited{color:#fff}.btn-disabled,.btn-disabled:active,.btn-disabled:focus,.btn-disabled:hover,.btn:disabled{background-image:none;filter:progid:DXImageTransform.Microsoft.gradient(enabled = false);filter:alpha(opacity=40);opacity:.4;cursor:not-allowed;box-shadow:none}.btn::-moz-focus-inner{padding:0;border:0}.btn-small{font-size:80%}.btn-info{background-color:#2980b9!important}.btn-info:hover{background-color:#2e8ece!important}.btn-neutral{background-color:#f3f6f6!important;color:#404040!important}.btn-neutral:hover{background-color:#e5ebeb!important;color:#404040}.btn-neutral:visited{color:#404040!important}.btn-success{background-color:#27ae60!important}.btn-success:hover{background-color:#295!important}.btn-danger{background-color:#e74c3c!important}.btn-danger:hover{background-color:#ea6153!important}.btn-warning{background-color:#e67e22!important}.btn-warning:hover{background-color:#e98b39!important}.btn-invert{background-color:#222}.btn-invert:hover{background-color:#2f2f2f!important}.btn-link{background-color:transparent!important;color:#2980b9;box-shadow:none;border-color:transparent!important}.btn-link:active,.btn-link:hover{background-color:transparent!important;color:#409ad5!important;box-shadow:none}.btn-link:visited{color:#9b59b6}.wy-btn-group .btn,.wy-control .btn{vertical-align:middle}.wy-btn-group{margin-bottom:24px;*zoom:1}.wy-btn-group:after,.wy-btn-group:before{display:table;content:""}.wy-btn-group:after{clear:both}.wy-dropdown{position:relative;display:inline-block}.wy-dropdown-active .wy-dropdown-menu{display:block}.wy-dropdown-menu{position:absolute;left:0;display:none;float:left;top:100%;min-width:100%;background:#fcfcfc;z-index:100;border:1px solid #cfd7dd;box-shadow:0 2px 2px 0 rgba(0,0,0,.1);padding:12px}.wy-dropdown-menu>dd>a{display:block;clear:both;color:#404040;white-space:nowrap;font-size:90%;padding:0 12px;cursor:pointer}.wy-dropdown-menu>dd>a:hover{background:#2980b9;color:#fff}.wy-dropdown-menu>dd.divider{border-top:1px solid #cfd7dd;margin:6px 0}.wy-dropdown-menu>dd.search{padding-bottom:12px}.wy-dropdown-menu>dd.search 
input[type=search]{width:100%}.wy-dropdown-menu>dd.call-to-action{background:#e3e3e3;text-transform:uppercase;font-weight:500;font-size:80%}.wy-dropdown-menu>dd.call-to-action:hover{background:#e3e3e3}.wy-dropdown-menu>dd.call-to-action .btn{color:#fff}.wy-dropdown.wy-dropdown-up .wy-dropdown-menu{bottom:100%;top:auto;left:auto;right:0}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu{background:#fcfcfc;margin-top:2px}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a{padding:6px 12px}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a:hover{background:#2980b9;color:#fff}.wy-dropdown.wy-dropdown-left .wy-dropdown-menu{right:0;left:auto;text-align:right}.wy-dropdown-arrow:before{content:" ";border-bottom:5px solid #f5f5f5;border-left:5px solid transparent;border-right:5px solid transparent;position:absolute;display:block;top:-4px;left:50%;margin-left:-3px}.wy-dropdown-arrow.wy-dropdown-arrow-left:before{left:11px}.wy-form-stacked select{display:block}.wy-form-aligned .wy-help-inline,.wy-form-aligned input,.wy-form-aligned label,.wy-form-aligned select,.wy-form-aligned textarea{display:inline-block;*display:inline;*zoom:1;vertical-align:middle}.wy-form-aligned .wy-control-group>label{display:inline-block;vertical-align:middle;width:10em;margin:6px 12px 0 0;float:left}.wy-form-aligned .wy-control{float:left}.wy-form-aligned .wy-control label{display:block}.wy-form-aligned .wy-control select{margin-top:6px}fieldset{margin:0}fieldset,legend{border:0;padding:0}legend{width:100%;white-space:normal;margin-bottom:24px;font-size:150%;*margin-left:-7px}label,legend{display:block}label{margin:0 0 .3125em;color:#333;font-size:90%}input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}.wy-control-group{margin-bottom:24px;max-width:1200px;margin-left:auto;margin-right:auto;*zoom:1}.wy-control-group:after,.wy-control-group:before{display:table;content:""}.wy-control-group:after{clear:both}.wy-control-group.wy-control-group-required>label:after{content:" *";color:#e74c3c}.wy-control-group .wy-form-full,.wy-control-group .wy-form-halves,.wy-control-group .wy-form-thirds{padding-bottom:12px}.wy-control-group .wy-form-full input[type=color],.wy-control-group .wy-form-full input[type=date],.wy-control-group .wy-form-full input[type=datetime-local],.wy-control-group .wy-form-full input[type=datetime],.wy-control-group .wy-form-full input[type=email],.wy-control-group .wy-form-full input[type=month],.wy-control-group .wy-form-full input[type=number],.wy-control-group .wy-form-full input[type=password],.wy-control-group .wy-form-full input[type=search],.wy-control-group .wy-form-full input[type=tel],.wy-control-group .wy-form-full input[type=text],.wy-control-group .wy-form-full input[type=time],.wy-control-group .wy-form-full input[type=url],.wy-control-group .wy-form-full input[type=week],.wy-control-group .wy-form-full select,.wy-control-group .wy-form-halves input[type=color],.wy-control-group .wy-form-halves input[type=date],.wy-control-group .wy-form-halves input[type=datetime-local],.wy-control-group .wy-form-halves input[type=datetime],.wy-control-group .wy-form-halves input[type=email],.wy-control-group .wy-form-halves input[type=month],.wy-control-group .wy-form-halves input[type=number],.wy-control-group .wy-form-halves input[type=password],.wy-control-group .wy-form-halves input[type=search],.wy-control-group .wy-form-halves input[type=tel],.wy-control-group .wy-form-halves input[type=text],.wy-control-group .wy-form-halves input[type=time],.wy-control-group 
.wy-form-halves input[type=url],.wy-control-group .wy-form-halves input[type=week],.wy-control-group .wy-form-halves select,.wy-control-group .wy-form-thirds input[type=color],.wy-control-group .wy-form-thirds input[type=date],.wy-control-group .wy-form-thirds input[type=datetime-local],.wy-control-group .wy-form-thirds input[type=datetime],.wy-control-group .wy-form-thirds input[type=email],.wy-control-group .wy-form-thirds input[type=month],.wy-control-group .wy-form-thirds input[type=number],.wy-control-group .wy-form-thirds input[type=password],.wy-control-group .wy-form-thirds input[type=search],.wy-control-group .wy-form-thirds input[type=tel],.wy-control-group .wy-form-thirds input[type=text],.wy-control-group .wy-form-thirds input[type=time],.wy-control-group .wy-form-thirds input[type=url],.wy-control-group .wy-form-thirds input[type=week],.wy-control-group .wy-form-thirds select{width:100%}.wy-control-group .wy-form-full{float:left;display:block;width:100%;margin-right:0}.wy-control-group .wy-form-full:last-child{margin-right:0}.wy-control-group .wy-form-halves{float:left;display:block;margin-right:2.35765%;width:48.82117%}.wy-control-group .wy-form-halves:last-child,.wy-control-group .wy-form-halves:nth-of-type(2n){margin-right:0}.wy-control-group .wy-form-halves:nth-of-type(odd){clear:left}.wy-control-group .wy-form-thirds{float:left;display:block;margin-right:2.35765%;width:31.76157%}.wy-control-group .wy-form-thirds:last-child,.wy-control-group .wy-form-thirds:nth-of-type(3n){margin-right:0}.wy-control-group .wy-form-thirds:nth-of-type(3n+1){clear:left}.wy-control-group.wy-control-group-no-input .wy-control,.wy-control-no-input{margin:6px 0 0;font-size:90%}.wy-control-no-input{display:inline-block}.wy-control-group.fluid-input input[type=color],.wy-control-group.fluid-input input[type=date],.wy-control-group.fluid-input input[type=datetime-local],.wy-control-group.fluid-input input[type=datetime],.wy-control-group.fluid-input input[type=email],.wy-control-group.fluid-input input[type=month],.wy-control-group.fluid-input input[type=number],.wy-control-group.fluid-input input[type=password],.wy-control-group.fluid-input input[type=search],.wy-control-group.fluid-input input[type=tel],.wy-control-group.fluid-input input[type=text],.wy-control-group.fluid-input input[type=time],.wy-control-group.fluid-input input[type=url],.wy-control-group.fluid-input input[type=week]{width:100%}.wy-form-message-inline{padding-left:.3em;color:#666;font-size:90%}.wy-form-message{display:block;color:#999;font-size:70%;margin-top:.3125em;font-style:italic}.wy-form-message p{font-size:inherit;font-style:italic;margin-bottom:6px}.wy-form-message p:last-child{margin-bottom:0}input{line-height:normal}input[type=button],input[type=reset],input[type=submit]{-webkit-appearance:button;cursor:pointer;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;*overflow:visible}input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week]{-webkit-appearance:none;padding:6px;display:inline-block;border:1px solid #ccc;font-size:80%;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;box-shadow:inset 0 1px 3px #ddd;border-radius:0;-webkit-transition:border .3s linear;-moz-transition:border .3s linear;transition:border .3s linear}input[type=datetime-local]{padding:.34375em 
.625em}input[disabled]{cursor:default}input[type=checkbox],input[type=radio]{padding:0;margin-right:.3125em;*height:13px;*width:13px}input[type=checkbox],input[type=radio],input[type=search]{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}input[type=search]::-webkit-search-cancel-button,input[type=search]::-webkit-search-decoration{-webkit-appearance:none}input[type=color]:focus,input[type=date]:focus,input[type=datetime-local]:focus,input[type=datetime]:focus,input[type=email]:focus,input[type=month]:focus,input[type=number]:focus,input[type=password]:focus,input[type=search]:focus,input[type=tel]:focus,input[type=text]:focus,input[type=time]:focus,input[type=url]:focus,input[type=week]:focus{outline:0;outline:thin dotted\9;border-color:#333}input.no-focus:focus{border-color:#ccc!important}input[type=checkbox]:focus,input[type=file]:focus,input[type=radio]:focus{outline:thin dotted #333;outline:1px auto #129fea}input[type=color][disabled],input[type=date][disabled],input[type=datetime-local][disabled],input[type=datetime][disabled],input[type=email][disabled],input[type=month][disabled],input[type=number][disabled],input[type=password][disabled],input[type=search][disabled],input[type=tel][disabled],input[type=text][disabled],input[type=time][disabled],input[type=url][disabled],input[type=week][disabled]{cursor:not-allowed;background-color:#fafafa}input:focus:invalid,select:focus:invalid,textarea:focus:invalid{color:#e74c3c;border:1px solid #e74c3c}input:focus:invalid:focus,select:focus:invalid:focus,textarea:focus:invalid:focus{border-color:#e74c3c}input[type=checkbox]:focus:invalid:focus,input[type=file]:focus:invalid:focus,input[type=radio]:focus:invalid:focus{outline-color:#e74c3c}input.wy-input-large{padding:12px;font-size:100%}textarea{overflow:auto;vertical-align:top;width:100%;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif}select,textarea{padding:.5em .625em;display:inline-block;border:1px solid #ccc;font-size:80%;box-shadow:inset 0 1px 3px #ddd;-webkit-transition:border .3s linear;-moz-transition:border .3s linear;transition:border .3s linear}select{border:1px solid #ccc;background-color:#fff}select[multiple]{height:auto}select:focus,textarea:focus{outline:0}input[readonly],select[disabled],select[readonly],textarea[disabled],textarea[readonly]{cursor:not-allowed;background-color:#fafafa}input[type=checkbox][disabled],input[type=radio][disabled]{cursor:not-allowed}.wy-checkbox,.wy-radio{margin:6px 0;color:#404040;display:block}.wy-checkbox input,.wy-radio input{vertical-align:baseline}.wy-form-message-inline{display:inline-block;*display:inline;*zoom:1;vertical-align:middle}.wy-input-prefix,.wy-input-suffix{white-space:nowrap;padding:6px}.wy-input-prefix .wy-input-context,.wy-input-suffix .wy-input-context{line-height:27px;padding:0 8px;display:inline-block;font-size:80%;background-color:#f3f6f6;border:1px solid #ccc;color:#999}.wy-input-suffix .wy-input-context{border-left:0}.wy-input-prefix .wy-input-context{border-right:0}.wy-switch{position:relative;display:block;height:24px;margin-top:12px;cursor:pointer}.wy-switch:before{left:0;top:0;width:36px;height:12px;background:#ccc}.wy-switch:after,.wy-switch:before{position:absolute;content:"";display:block;border-radius:4px;-webkit-transition:all .2s ease-in-out;-moz-transition:all .2s ease-in-out;transition:all .2s ease-in-out}.wy-switch:after{width:18px;height:18px;background:#999;left:-3px;top:-3px}.wy-switch 
span{position:absolute;left:48px;display:block;font-size:12px;color:#ccc;line-height:1}.wy-switch.active:before{background:#1e8449}.wy-switch.active:after{left:24px;background:#27ae60}.wy-switch.disabled{cursor:not-allowed;opacity:.8}.wy-control-group.wy-control-group-error .wy-form-message,.wy-control-group.wy-control-group-error>label{color:#e74c3c}.wy-control-group.wy-control-group-error input[type=color],.wy-control-group.wy-control-group-error input[type=date],.wy-control-group.wy-control-group-error input[type=datetime-local],.wy-control-group.wy-control-group-error input[type=datetime],.wy-control-group.wy-control-group-error input[type=email],.wy-control-group.wy-control-group-error input[type=month],.wy-control-group.wy-control-group-error input[type=number],.wy-control-group.wy-control-group-error input[type=password],.wy-control-group.wy-control-group-error input[type=search],.wy-control-group.wy-control-group-error input[type=tel],.wy-control-group.wy-control-group-error input[type=text],.wy-control-group.wy-control-group-error input[type=time],.wy-control-group.wy-control-group-error input[type=url],.wy-control-group.wy-control-group-error input[type=week],.wy-control-group.wy-control-group-error textarea{border:1px solid #e74c3c}.wy-inline-validate{white-space:nowrap}.wy-inline-validate .wy-input-context{padding:.5em .625em;display:inline-block;font-size:80%}.wy-inline-validate.wy-inline-validate-success .wy-input-context{color:#27ae60}.wy-inline-validate.wy-inline-validate-danger .wy-input-context{color:#e74c3c}.wy-inline-validate.wy-inline-validate-warning .wy-input-context{color:#e67e22}.wy-inline-validate.wy-inline-validate-info .wy-input-context{color:#2980b9}.rotate-90{-webkit-transform:rotate(90deg);-moz-transform:rotate(90deg);-ms-transform:rotate(90deg);-o-transform:rotate(90deg);transform:rotate(90deg)}.rotate-180{-webkit-transform:rotate(180deg);-moz-transform:rotate(180deg);-ms-transform:rotate(180deg);-o-transform:rotate(180deg);transform:rotate(180deg)}.rotate-270{-webkit-transform:rotate(270deg);-moz-transform:rotate(270deg);-ms-transform:rotate(270deg);-o-transform:rotate(270deg);transform:rotate(270deg)}.mirror{-webkit-transform:scaleX(-1);-moz-transform:scaleX(-1);-ms-transform:scaleX(-1);-o-transform:scaleX(-1);transform:scaleX(-1)}.mirror.rotate-90{-webkit-transform:scaleX(-1) rotate(90deg);-moz-transform:scaleX(-1) rotate(90deg);-ms-transform:scaleX(-1) rotate(90deg);-o-transform:scaleX(-1) rotate(90deg);transform:scaleX(-1) rotate(90deg)}.mirror.rotate-180{-webkit-transform:scaleX(-1) rotate(180deg);-moz-transform:scaleX(-1) rotate(180deg);-ms-transform:scaleX(-1) rotate(180deg);-o-transform:scaleX(-1) rotate(180deg);transform:scaleX(-1) rotate(180deg)}.mirror.rotate-270{-webkit-transform:scaleX(-1) rotate(270deg);-moz-transform:scaleX(-1) rotate(270deg);-ms-transform:scaleX(-1) rotate(270deg);-o-transform:scaleX(-1) rotate(270deg);transform:scaleX(-1) rotate(270deg)}@media only screen and (max-width:480px){.wy-form button[type=submit]{margin:.7em 0 0}.wy-form input[type=color],.wy-form input[type=date],.wy-form input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=text],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week],.wy-form label{margin-bottom:.3em;display:block}.wy-form input[type=color],.wy-form input[type=date],.wy-form 
input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week]{margin-bottom:0}.wy-form-aligned .wy-control-group label{margin-bottom:.3em;text-align:left;display:block;width:100%}.wy-form-aligned .wy-control{margin:1.5em 0 0}.wy-form-message,.wy-form-message-inline,.wy-form .wy-help-inline{display:block;font-size:80%;padding:6px 0}}@media screen and (max-width:768px){.tablet-hide{display:none}}@media screen and (max-width:480px){.mobile-hide{display:none}}.float-left{float:left}.float-right{float:right}.full-width{width:100%}.rst-content table.docutils,.rst-content table.field-list,.wy-table{border-collapse:collapse;border-spacing:0;empty-cells:show;margin-bottom:24px}.rst-content table.docutils caption,.rst-content table.field-list caption,.wy-table caption{color:#000;font:italic 85%/1 arial,sans-serif;padding:1em 0;text-align:center}.rst-content table.docutils td,.rst-content table.docutils th,.rst-content table.field-list td,.rst-content table.field-list th,.wy-table td,.wy-table th{font-size:90%;margin:0;overflow:visible;padding:8px 16px}.rst-content table.docutils td:first-child,.rst-content table.docutils th:first-child,.rst-content table.field-list td:first-child,.rst-content table.field-list th:first-child,.wy-table td:first-child,.wy-table th:first-child{border-left-width:0}.rst-content table.docutils thead,.rst-content table.field-list thead,.wy-table thead{color:#000;text-align:left;vertical-align:bottom;white-space:nowrap}.rst-content table.docutils thead th,.rst-content table.field-list thead th,.wy-table thead th{font-weight:700;border-bottom:2px solid #e1e4e5}.rst-content table.docutils td,.rst-content table.field-list td,.wy-table td{background-color:transparent;vertical-align:middle}.rst-content table.docutils td p,.rst-content table.field-list td p,.wy-table td p{line-height:18px}.rst-content table.docutils td p:last-child,.rst-content table.field-list td p:last-child,.wy-table td p:last-child{margin-bottom:0}.rst-content table.docutils .wy-table-cell-min,.rst-content table.field-list .wy-table-cell-min,.wy-table .wy-table-cell-min{width:1%;padding-right:0}.rst-content table.docutils .wy-table-cell-min input[type=checkbox],.rst-content table.field-list .wy-table-cell-min input[type=checkbox],.wy-table .wy-table-cell-min input[type=checkbox]{margin:0}.wy-table-secondary{color:grey;font-size:90%}.wy-table-tertiary{color:grey;font-size:80%}.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,.wy-table-backed,.wy-table-odd td,.wy-table-striped tr:nth-child(2n-1) td{background-color:#f3f6f6}.rst-content table.docutils,.wy-table-bordered-all{border:1px solid #e1e4e5}.rst-content table.docutils td,.wy-table-bordered-all td{border-bottom:1px solid #e1e4e5;border-left:1px solid #e1e4e5}.rst-content table.docutils tbody>tr:last-child td,.wy-table-bordered-all tbody>tr:last-child td{border-bottom-width:0}.wy-table-bordered{border:1px solid #e1e4e5}.wy-table-bordered-rows td{border-bottom:1px solid #e1e4e5}.wy-table-bordered-rows tbody>tr:last-child td{border-bottom-width:0}.wy-table-horizontal td,.wy-table-horizontal th{border-width:0 0 1px;border-bottom:1px solid #e1e4e5}.wy-table-horizontal tbody>tr:last-child td{border-bottom-width:0}.wy-table-responsive{margin-bottom:24px;max-width:100%;overflow:auto}.wy-table-responsive 
table{margin-bottom:0!important}.wy-table-responsive table td,.wy-table-responsive table th{white-space:nowrap}a{color:#2980b9;text-decoration:none;cursor:pointer}a:hover{color:#3091d1}a:visited{color:#9b59b6}html{height:100%}body,html{overflow-x:hidden}body{font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;font-weight:400;color:#404040;min-height:100%;background:#edf0f2}.wy-text-left{text-align:left}.wy-text-center{text-align:center}.wy-text-right{text-align:right}.wy-text-large{font-size:120%}.wy-text-normal{font-size:100%}.wy-text-small,small{font-size:80%}.wy-text-strike{text-decoration:line-through}.wy-text-warning{color:#e67e22!important}a.wy-text-warning:hover{color:#eb9950!important}.wy-text-info{color:#2980b9!important}a.wy-text-info:hover{color:#409ad5!important}.wy-text-success{color:#27ae60!important}a.wy-text-success:hover{color:#36d278!important}.wy-text-danger{color:#e74c3c!important}a.wy-text-danger:hover{color:#ed7669!important}.wy-text-neutral{color:#404040!important}a.wy-text-neutral:hover{color:#595959!important}.rst-content .toctree-wrapper>p.caption,h1,h2,h3,h4,h5,h6,legend{margin-top:0;font-weight:700;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif}p{line-height:24px;font-size:16px;margin:0 0 24px}h1{font-size:175%}.rst-content .toctree-wrapper>p.caption,h2{font-size:150%}h3{font-size:125%}h4{font-size:115%}h5{font-size:110%}h6{font-size:100%}hr{display:block;height:1px;border:0;border-top:1px solid #e1e4e5;margin:24px 0;padding:0}.rst-content code,.rst-content tt,code{white-space:nowrap;max-width:100%;background:#fff;border:1px solid #e1e4e5;font-size:75%;padding:0 5px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#e74c3c;overflow-x:auto}.rst-content tt.code-large,code.code-large{font-size:90%}.rst-content .section ul,.rst-content .toctree-wrapper ul,.rst-content section ul,.wy-plain-list-disc,article ul{list-style:disc;line-height:24px;margin-bottom:24px}.rst-content .section ul li,.rst-content .toctree-wrapper ul li,.rst-content section ul li,.wy-plain-list-disc li,article ul li{list-style:disc;margin-left:24px}.rst-content .section ul li p:last-child,.rst-content .section ul li ul,.rst-content .toctree-wrapper ul li p:last-child,.rst-content .toctree-wrapper ul li ul,.rst-content section ul li p:last-child,.rst-content section ul li ul,.wy-plain-list-disc li p:last-child,.wy-plain-list-disc li ul,article ul li p:last-child,article ul li ul{margin-bottom:0}.rst-content .section ul li li,.rst-content .toctree-wrapper ul li li,.rst-content section ul li li,.wy-plain-list-disc li li,article ul li li{list-style:circle}.rst-content .section ul li li li,.rst-content .toctree-wrapper ul li li li,.rst-content section ul li li li,.wy-plain-list-disc li li li,article ul li li li{list-style:square}.rst-content .section ul li ol li,.rst-content .toctree-wrapper ul li ol li,.rst-content section ul li ol li,.wy-plain-list-disc li ol li,article ul li ol li{list-style:decimal}.rst-content .section ol,.rst-content .section ol.arabic,.rst-content .toctree-wrapper ol,.rst-content .toctree-wrapper ol.arabic,.rst-content section ol,.rst-content section ol.arabic,.wy-plain-list-decimal,article ol{list-style:decimal;line-height:24px;margin-bottom:24px}.rst-content .section ol.arabic li,.rst-content .section ol li,.rst-content .toctree-wrapper ol.arabic li,.rst-content .toctree-wrapper ol li,.rst-content section ol.arabic li,.rst-content section ol li,.wy-plain-list-decimal li,article ol 
li{list-style:decimal;margin-left:24px}.rst-content .section ol.arabic li ul,.rst-content .section ol li p:last-child,.rst-content .section ol li ul,.rst-content .toctree-wrapper ol.arabic li ul,.rst-content .toctree-wrapper ol li p:last-child,.rst-content .toctree-wrapper ol li ul,.rst-content section ol.arabic li ul,.rst-content section ol li p:last-child,.rst-content section ol li ul,.wy-plain-list-decimal li p:last-child,.wy-plain-list-decimal li ul,article ol li p:last-child,article ol li ul{margin-bottom:0}.rst-content .section ol.arabic li ul li,.rst-content .section ol li ul li,.rst-content .toctree-wrapper ol.arabic li ul li,.rst-content .toctree-wrapper ol li ul li,.rst-content section ol.arabic li ul li,.rst-content section ol li ul li,.wy-plain-list-decimal li ul li,article ol li ul li{list-style:disc}.wy-breadcrumbs{*zoom:1}.wy-breadcrumbs:after,.wy-breadcrumbs:before{display:table;content:""}.wy-breadcrumbs:after{clear:both}.wy-breadcrumbs>li{display:inline-block;padding-top:5px}.wy-breadcrumbs>li.wy-breadcrumbs-aside{float:right}.rst-content .wy-breadcrumbs>li code,.rst-content .wy-breadcrumbs>li tt,.wy-breadcrumbs>li .rst-content tt,.wy-breadcrumbs>li code{all:inherit;color:inherit}.breadcrumb-item:before{content:"/";color:#bbb;font-size:13px;padding:0 6px 0 3px}.wy-breadcrumbs-extra{margin-bottom:0;color:#b3b3b3;font-size:80%;display:inline-block}@media screen and (max-width:480px){.wy-breadcrumbs-extra,.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}@media print{.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}html{font-size:16px}.wy-affix{position:fixed;top:1.618em}.wy-menu a:hover{text-decoration:none}.wy-menu-horiz{*zoom:1}.wy-menu-horiz:after,.wy-menu-horiz:before{display:table;content:""}.wy-menu-horiz:after{clear:both}.wy-menu-horiz li,.wy-menu-horiz ul{display:inline-block}.wy-menu-horiz li:hover{background:hsla(0,0%,100%,.1)}.wy-menu-horiz li.divide-left{border-left:1px solid #404040}.wy-menu-horiz li.divide-right{border-right:1px solid #404040}.wy-menu-horiz a{height:32px;display:inline-block;line-height:32px;padding:0 16px}.wy-menu-vertical{width:300px}.wy-menu-vertical header,.wy-menu-vertical p.caption{color:#55a5d9;height:32px;line-height:32px;padding:0 1.618em;margin:12px 0 0;display:block;font-weight:700;text-transform:uppercase;font-size:85%;white-space:nowrap}.wy-menu-vertical ul{margin-bottom:0}.wy-menu-vertical li.divide-top{border-top:1px solid #404040}.wy-menu-vertical li.divide-bottom{border-bottom:1px solid #404040}.wy-menu-vertical li.current{background:#e3e3e3}.wy-menu-vertical li.current a{color:grey;border-right:1px solid #c9c9c9;padding:.4045em 2.427em}.wy-menu-vertical li.current a:hover{background:#d6d6d6}.rst-content .wy-menu-vertical li tt,.wy-menu-vertical li .rst-content tt,.wy-menu-vertical li code{border:none;background:inherit;color:inherit;padding-left:0;padding-right:0}.wy-menu-vertical li button.toctree-expand{display:block;float:left;margin-left:-1.2em;line-height:18px;color:#4d4d4d;border:none;background:none;padding:0}.wy-menu-vertical li.current>a,.wy-menu-vertical li.on a{color:#404040;font-weight:700;position:relative;background:#fcfcfc;border:none;padding:.4045em 1.618em}.wy-menu-vertical li.current>a:hover,.wy-menu-vertical li.on a:hover{background:#fcfcfc}.wy-menu-vertical li.current>a:hover button.toctree-expand,.wy-menu-vertical li.on a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a 
button.toctree-expand{display:block;line-height:18px;color:#333}.wy-menu-vertical li.toctree-l1.current>a{border-bottom:1px solid #c9c9c9;border-top:1px solid #c9c9c9}.wy-menu-vertical .toctree-l1.current .toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .toctree-l11>ul{display:none}.wy-menu-vertical .toctree-l1.current .current.toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .current.toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .current.toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .current.toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .current.toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .current.toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .current.toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .current.toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .current.toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .current.toctree-l11>ul{display:block}.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4{font-size:.9em}.wy-menu-vertical li.toctree-l2 a,.wy-menu-vertical li.toctree-l3 a,.wy-menu-vertical li.toctree-l4 a,.wy-menu-vertical li.toctree-l5 a,.wy-menu-vertical li.toctree-l6 a,.wy-menu-vertical li.toctree-l7 a,.wy-menu-vertical li.toctree-l8 a,.wy-menu-vertical li.toctree-l9 a,.wy-menu-vertical li.toctree-l10 a{color:#404040}.wy-menu-vertical li.toctree-l2 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l3 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l4 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l5 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l6 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l7 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l8 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l9 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l10 a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{display:block}.wy-menu-vertical li.toctree-l2.current>a{padding:.4045em 2.427em}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{padding:.4045em 1.618em .4045em 4.045em}.wy-menu-vertical li.toctree-l3.current>a{padding:.4045em 4.045em}.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{padding:.4045em 1.618em .4045em 5.663em}.wy-menu-vertical li.toctree-l4.current>a{padding:.4045em 5.663em}.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a{padding:.4045em 1.618em .4045em 7.281em}.wy-menu-vertical li.toctree-l5.current>a{padding:.4045em 7.281em}.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a{padding:.4045em 1.618em .4045em 8.899em}.wy-menu-vertical li.toctree-l6.current>a{padding:.4045em 
8.899em}.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a{padding:.4045em 1.618em .4045em 10.517em}.wy-menu-vertical li.toctree-l7.current>a{padding:.4045em 10.517em}.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a{padding:.4045em 1.618em .4045em 12.135em}.wy-menu-vertical li.toctree-l8.current>a{padding:.4045em 12.135em}.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a{padding:.4045em 1.618em .4045em 13.753em}.wy-menu-vertical li.toctree-l9.current>a{padding:.4045em 13.753em}.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a{padding:.4045em 1.618em .4045em 15.371em}.wy-menu-vertical li.toctree-l10.current>a{padding:.4045em 15.371em}.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{padding:.4045em 1.618em .4045em 16.989em}.wy-menu-vertical li.toctree-l2.current>a,.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{background:#c9c9c9}.wy-menu-vertical li.toctree-l2 button.toctree-expand{color:#a3a3a3}.wy-menu-vertical li.toctree-l3.current>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{background:#bdbdbd}.wy-menu-vertical li.toctree-l3 button.toctree-expand{color:#969696}.wy-menu-vertical li.current ul{display:block}.wy-menu-vertical li ul{margin-bottom:0;display:none}.wy-menu-vertical li ul li a{margin-bottom:0;color:#d9d9d9;font-weight:400}.wy-menu-vertical a{line-height:18px;padding:.4045em 1.618em;display:block;position:relative;font-size:90%;color:#d9d9d9}.wy-menu-vertical a:hover{background-color:#4e4a4a;cursor:pointer}.wy-menu-vertical a:hover button.toctree-expand{color:#d9d9d9}.wy-menu-vertical a:active{background-color:#2980b9;cursor:pointer;color:#fff}.wy-menu-vertical a:active button.toctree-expand{color:#fff}.wy-side-nav-search{display:block;width:300px;padding:.809em;margin-bottom:.809em;z-index:200;background-color:#2980b9;text-align:center;color:#fcfcfc}.wy-side-nav-search input[type=text]{width:100%;border-radius:50px;padding:6px 12px;border-color:#2472a4}.wy-side-nav-search img{display:block;margin:auto auto .809em;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a{color:#fcfcfc;font-size:100%;font-weight:700;display:inline-block;padding:4px 6px;margin-bottom:.809em;max-width:100%}.wy-side-nav-search .wy-dropdown>a:hover,.wy-side-nav-search>a:hover{background:hsla(0,0%,100%,.1)}.wy-side-nav-search .wy-dropdown>a img.logo,.wy-side-nav-search>a img.logo{display:block;margin:0 auto;height:auto;width:auto;border-radius:0;max-width:100%;background:transparent}.wy-side-nav-search .wy-dropdown>a.icon img.logo,.wy-side-nav-search>a.icon img.logo{margin-top:.85em}.wy-side-nav-search>div.version{margin-top:-.4045em;margin-bottom:.809em;font-weight:400;color:hsla(0,0%,100%,.3)}.wy-nav .wy-menu-vertical header{color:#2980b9}.wy-nav .wy-menu-vertical a{color:#b3b3b3}.wy-nav .wy-menu-vertical a:hover{background-color:#2980b9;color:#fff}[data-menu-wrap]{-webkit-transition:all .2s ease-in;-moz-transition:all .2s ease-in;transition:all .2s 
ease-in;position:absolute;opacity:1;width:100%;opacity:0}[data-menu-wrap].move-center{left:0;right:auto;opacity:1}[data-menu-wrap].move-left{right:auto;left:-100%;opacity:0}[data-menu-wrap].move-right{right:-100%;left:auto;opacity:0}.wy-body-for-nav{background:#fcfcfc}.wy-grid-for-nav{position:absolute;width:100%;height:100%}.wy-nav-side{position:fixed;top:0;bottom:0;left:0;padding-bottom:2em;width:300px;overflow-x:hidden;overflow-y:hidden;min-height:100%;color:#9b9b9b;background:#343131;z-index:200}.wy-side-scroll{width:320px;position:relative;overflow-x:hidden;overflow-y:scroll;height:100%}.wy-nav-top{display:none;background:#2980b9;color:#fff;padding:.4045em .809em;position:relative;line-height:50px;text-align:center;font-size:100%;*zoom:1}.wy-nav-top:after,.wy-nav-top:before{display:table;content:""}.wy-nav-top:after{clear:both}.wy-nav-top a{color:#fff;font-weight:700}.wy-nav-top img{margin-right:12px;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-nav-top i{font-size:30px;float:left;cursor:pointer;padding-top:inherit}.wy-nav-content-wrap{margin-left:300px;background:#fcfcfc;min-height:100%}.wy-nav-content{padding:1.618em 3.236em;height:100%;max-width:800px;margin:auto}.wy-body-mask{position:fixed;width:100%;height:100%;background:rgba(0,0,0,.2);display:none;z-index:499}.wy-body-mask.on{display:block}footer{color:grey}footer p{margin-bottom:12px}.rst-content footer span.commit tt,footer span.commit .rst-content tt,footer span.commit code{padding:0;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:1em;background:none;border:none;color:grey}.rst-footer-buttons{*zoom:1}.rst-footer-buttons:after,.rst-footer-buttons:before{width:100%;display:table;content:""}.rst-footer-buttons:after{clear:both}.rst-breadcrumbs-buttons{margin-top:12px;*zoom:1}.rst-breadcrumbs-buttons:after,.rst-breadcrumbs-buttons:before{display:table;content:""}.rst-breadcrumbs-buttons:after{clear:both}#search-results .search li{margin-bottom:24px;border-bottom:1px solid #e1e4e5;padding-bottom:24px}#search-results .search li:first-child{border-top:1px solid #e1e4e5;padding-top:24px}#search-results .search li a{font-size:120%;margin-bottom:12px;display:inline-block}#search-results .context{color:grey;font-size:90%}.genindextable li>ul{margin-left:24px}@media screen and (max-width:768px){.wy-body-for-nav{background:#fcfcfc}.wy-nav-top{display:block}.wy-nav-side{left:-300px}.wy-nav-side.shift{width:85%;left:0}.wy-menu.wy-menu-vertical,.wy-side-nav-search,.wy-side-scroll{width:auto}.wy-nav-content-wrap{margin-left:0}.wy-nav-content-wrap .wy-nav-content{padding:1.618em}.wy-nav-content-wrap.shift{position:fixed;min-width:100%;left:85%;top:0;height:100%;overflow:hidden}}@media screen and (min-width:1100px){.wy-nav-content-wrap{background:rgba(0,0,0,.05)}.wy-nav-content{margin:0;background:#fcfcfc}}@media print{.rst-versions,.wy-nav-side,footer{display:none}.wy-nav-content-wrap{margin-left:0}}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60;*zoom:1}.rst-versions .rst-current-version:after,.rst-versions .rst-current-version:before{display:table;content:""}.rst-versions 
.rst-current-version:after{clear:both}.rst-content .code-block-caption .rst-versions .rst-current-version .headerlink,.rst-content .eqno .rst-versions .rst-current-version .headerlink,.rst-content .rst-versions .rst-current-version .admonition-title,.rst-content code.download .rst-versions .rst-current-version span:first-child,.rst-content dl dt .rst-versions .rst-current-version .headerlink,.rst-content h1 .rst-versions .rst-current-version .headerlink,.rst-content h2 .rst-versions .rst-current-version .headerlink,.rst-content h3 .rst-versions .rst-current-version .headerlink,.rst-content h4 .rst-versions .rst-current-version .headerlink,.rst-content h5 .rst-versions .rst-current-version .headerlink,.rst-content h6 .rst-versions .rst-current-version .headerlink,.rst-content p .rst-versions .rst-current-version .headerlink,.rst-content table>caption .rst-versions .rst-current-version .headerlink,.rst-content tt.download .rst-versions .rst-current-version span:first-child,.rst-versions .rst-current-version .fa,.rst-versions .rst-current-version .icon,.rst-versions .rst-current-version .rst-content .admonition-title,.rst-versions .rst-current-version .rst-content .code-block-caption .headerlink,.rst-versions .rst-current-version .rst-content .eqno .headerlink,.rst-versions .rst-current-version .rst-content code.download span:first-child,.rst-versions .rst-current-version .rst-content dl dt .headerlink,.rst-versions .rst-current-version .rst-content h1 .headerlink,.rst-versions .rst-current-version .rst-content h2 .headerlink,.rst-versions .rst-current-version .rst-content h3 .headerlink,.rst-versions .rst-current-version .rst-content h4 .headerlink,.rst-versions .rst-current-version .rst-content h5 .headerlink,.rst-versions .rst-current-version .rst-content h6 .headerlink,.rst-versions .rst-current-version .rst-content p .headerlink,.rst-versions .rst-current-version .rst-content table>caption .headerlink,.rst-versions .rst-current-version .rst-content tt.download span:first-child,.rst-versions .rst-current-version .wy-menu-vertical li button.toctree-expand,.wy-menu-vertical li .rst-versions .rst-current-version button.toctree-expand{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and 
(max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}.rst-content .toctree-wrapper>p.caption,.rst-content h1,.rst-content h2,.rst-content h3,.rst-content h4,.rst-content h5,.rst-content h6{margin-bottom:24px}.rst-content img{max-width:100%;height:auto}.rst-content div.figure,.rst-content figure{margin-bottom:24px}.rst-content div.figure .caption-text,.rst-content figure .caption-text{font-style:italic}.rst-content div.figure p:last-child.caption,.rst-content figure p:last-child.caption{margin-bottom:0}.rst-content div.figure.align-center,.rst-content figure.align-center{text-align:center}.rst-content .section>a>img,.rst-content .section>img,.rst-content section>a>img,.rst-content section>img{margin-bottom:24px}.rst-content abbr[title]{text-decoration:none}.rst-content.style-external-links a.reference.external:after{font-family:FontAwesome;content:"\f08e";color:#b3b3b3;vertical-align:super;font-size:60%;margin:0 .2em}.rst-content blockquote{margin-left:24px;line-height:24px;margin-bottom:24px}.rst-content pre.literal-block{white-space:pre;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;display:block;overflow:auto}.rst-content div[class^=highlight],.rst-content pre.literal-block{border:1px solid #e1e4e5;overflow-x:auto;margin:1px 0 24px}.rst-content div[class^=highlight] div[class^=highlight],.rst-content pre.literal-block div[class^=highlight]{padding:0;border:none;margin:0}.rst-content div[class^=highlight] td.code{width:100%}.rst-content .linenodiv pre{border-right:1px solid #e6e9ea;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;user-select:none;pointer-events:none}.rst-content div[class^=highlight] pre{white-space:pre;margin:0;padding:12px;display:block;overflow:auto}.rst-content div[class^=highlight] pre .hll{display:block;margin:0 -12px;padding:0 12px}.rst-content .linenodiv pre,.rst-content div[class^=highlight] pre,.rst-content pre.literal-block{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:12px;line-height:1.4}.rst-content div.highlight .gp,.rst-content div.highlight span.linenos{user-select:none;pointer-events:none}.rst-content div.highlight span.linenos{display:inline-block;padding-left:0;padding-right:12px;margin-right:12px;border-right:1px solid #e6e9ea}.rst-content .code-block-caption{font-style:italic;font-size:85%;line-height:1;padding:1em 0;text-align:center}@media print{.rst-content .codeblock,.rst-content div[class^=highlight],.rst-content div[class^=highlight] pre{white-space:pre-wrap}}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning{clear:both}.rst-content .admonition-todo .last,.rst-content .admonition-todo>:last-child,.rst-content .admonition .last,.rst-content .admonition>:last-child,.rst-content .attention .last,.rst-content .attention>:last-child,.rst-content .caution .last,.rst-content .caution>:last-child,.rst-content .danger .last,.rst-content .danger>:last-child,.rst-content .error .last,.rst-content .error>:last-child,.rst-content .hint .last,.rst-content .hint>:last-child,.rst-content .important .last,.rst-content .important>:last-child,.rst-content .note .last,.rst-content .note>:last-child,.rst-content .seealso 
.last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section ol.loweralpha>li,.rst-content .toctree-wrapper ol.loweralpha,.rst-content .toctree-wrapper ol.loweralpha>li,.rst-content section ol.loweralpha,.rst-content section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li,.rst-content .toctree-wrapper ol.upperalpha,.rst-content .toctree-wrapper ol.upperalpha>li,.rst-content section ol.upperalpha,.rst-content section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*,.rst-content .toctree-wrapper ol li>*,.rst-content .toctree-wrapper ul li>*,.rst-content section ol li>*,.rst-content section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child,.rst-content .toctree-wrapper ol li>:first-child,.rst-content .toctree-wrapper ul li>:first-child,.rst-content section ol li>:first-child,.rst-content section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child,.rst-content .toctree-wrapper ol li>p,.rst-content .toctree-wrapper ol li>p:last-child,.rst-content .toctree-wrapper ul li>p,.rst-content .toctree-wrapper ul li>p:last-child,.rst-content section ol li>p,.rst-content section ol li>p:last-child,.rst-content section ul li>p,.rst-content section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child,.rst-content .toctree-wrapper ol li>p:only-child,.rst-content .toctree-wrapper ol li>p:only-child:last-child,.rst-content .toctree-wrapper ul li>p:only-child,.rst-content .toctree-wrapper ul li>p:only-child:last-child,.rst-content section ol li>p:only-child,.rst-content section ol li>p:only-child:last-child,.rst-content section ul li>p:only-child,.rst-content section ul li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul,.rst-content .toctree-wrapper ol li>ol,.rst-content .toctree-wrapper ol li>ul,.rst-content .toctree-wrapper ul li>ol,.rst-content .toctree-wrapper ul li>ul,.rst-content section ol li>ol,.rst-content section ol li>ul,.rst-content section ul li>ol,.rst-content section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul,.rst-content .toctree-wrapper ol.simple li>*,.rst-content .toctree-wrapper ol.simple li ol,.rst-content .toctree-wrapper ol.simple li ul,.rst-content .toctree-wrapper ul.simple li>*,.rst-content .toctree-wrapper ul.simple li ol,.rst-content .toctree-wrapper ul.simple li ul,.rst-content section ol.simple li>*,.rst-content section ol.simple li ol,.rst-content section ol.simple li 
ul,.rst-content section ul.simple li>*,.rst-content section ul.simple li ol,.rst-content section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink{opacity:0;font-size:14px;font-family:FontAwesome;margin-left:.5em}.rst-content .code-block-caption .headerlink:focus,.rst-content .code-block-caption:hover .headerlink,.rst-content .eqno .headerlink:focus,.rst-content .eqno:hover .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink:focus,.rst-content .toctree-wrapper>p.caption:hover .headerlink,.rst-content dl dt .headerlink:focus,.rst-content dl dt:hover .headerlink,.rst-content h1 .headerlink:focus,.rst-content h1:hover .headerlink,.rst-content h2 .headerlink:focus,.rst-content h2:hover .headerlink,.rst-content h3 .headerlink:focus,.rst-content h3:hover .headerlink,.rst-content h4 .headerlink:focus,.rst-content h4:hover .headerlink,.rst-content h5 .headerlink:focus,.rst-content h5:hover .headerlink,.rst-content h6 .headerlink:focus,.rst-content h6:hover .headerlink,.rst-content p.caption .headerlink:focus,.rst-content p.caption:hover .headerlink,.rst-content p .headerlink:focus,.rst-content p:hover .headerlink,.rst-content table>caption .headerlink:focus,.rst-content table>caption:hover .headerlink{opacity:1}.rst-content p a{overflow-wrap:anywhere}.rst-content .wy-table td p,.rst-content .wy-table td ul,.rst-content .wy-table th p,.rst-content .wy-table th ul,.rst-content table.docutils td p,.rst-content table.docutils td ul,.rst-content table.docutils th p,.rst-content table.docutils th ul,.rst-content table.field-list td p,.rst-content table.field-list td ul,.rst-content table.field-list th p,.rst-content table.field-list th ul{font-size:inherit}.rst-content .btn:focus{outline:2px solid}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .citation-reference>span.fn-bracket,.rst-content 
.footnote-reference>span.fn-bracket{display:none}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : "}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:auto minmax(80%,95%)}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{display:inline-grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{display:grid;grid-template-columns:auto auto minmax(.65rem,auto) minmax(40%,95%)}html.writer-html5 .rst-content aside.citation>span.label,html.writer-html5 .rst-content aside.footnote>span.label,html.writer-html5 .rst-content div.citation>span.label{grid-column-start:1;grid-column-end:2}html.writer-html5 .rst-content aside.citation>span.backrefs,html.writer-html5 .rst-content aside.footnote>span.backrefs,html.writer-html5 .rst-content div.citation>span.backrefs{grid-column-start:2;grid-column-end:3;grid-row-start:1;grid-row-end:3}html.writer-html5 .rst-content aside.citation>p,html.writer-html5 .rst-content aside.footnote>p,html.writer-html5 .rst-content div.citation>p{grid-column-start:4;grid-column-end:5}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{margin-bottom:24px}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.citation>dt>span.brackets:before,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.citation>dt>span.brackets:after,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a,html.writer-html5 
.rst-content dl.footnote>dt>span.fn-backref>a{word-break:keep-all}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a:not(:first-child):before,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.citation>dd p,html.writer-html5 .rst-content dl.footnote>dd p{font-size:.9rem}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{padding-left:1rem;padding-right:1rem;font-size:.9rem;line-height:1.2rem}html.writer-html5 .rst-content aside.citation p,html.writer-html5 .rst-content aside.footnote p,html.writer-html5 .rst-content div.citation p{font-size:.9rem;line-height:1.2rem;margin-bottom:12px}html.writer-html5 .rst-content aside.citation span.backrefs,html.writer-html5 .rst-content aside.footnote span.backrefs,html.writer-html5 .rst-content div.citation span.backrefs{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content aside.citation span.backrefs>a,html.writer-html5 .rst-content aside.footnote span.backrefs>a,html.writer-html5 .rst-content div.citation span.backrefs>a{word-break:keep-all}html.writer-html5 .rst-content aside.citation span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content aside.footnote span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content div.citation span.backrefs>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content aside.citation span.label,html.writer-html5 .rst-content aside.footnote span.label,html.writer-html5 .rst-content div.citation span.label{line-height:1.2rem}html.writer-html5 .rst-content aside.citation-list,html.writer-html5 .rst-content aside.footnote-list,html.writer-html5 .rst-content div.citation-list{margin-bottom:24px}html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content aside.footnote-list aside.footnote,html.writer-html5 .rst-content div.citation-list>div.citation,html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content aside.footnote-list aside.footnote code,html.writer-html5 .rst-content aside.footnote-list aside.footnote tt,html.writer-html5 .rst-content aside.footnote code,html.writer-html5 .rst-content aside.footnote tt,html.writer-html5 .rst-content div.citation-list>div.citation code,html.writer-html5 .rst-content div.citation-list>div.citation tt,html.writer-html5 .rst-content dl.citation code,html.writer-html5 .rst-content dl.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content 
.wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel,.rst-content .menuselection{font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .guilabel,.rst-content .menuselection{border:1px solid #7fbbe3;background:#e7f2fa}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/_static/doctools.js b/_static/doctools.js new file mode 100644 index 0000000..d06a71d --- /dev/null +++ b/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. 
+ */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git 
a/_static/documentation_options.js b/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/_static/documentation_options.js @@ -0,0 +1,14 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/_static/file.png b/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/_static/file.png differ diff --git a/_static/graphviz.css b/_static/graphviz.css new file mode 100644 index 0000000..8d81c02 --- /dev/null +++ b/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/_static/jquery.js b/_static/jquery.js new file mode 100644 index 0000000..c4c6022 --- /dev/null +++ b/_static/jquery.js @@ -0,0 +1,2 @@ +/*! jQuery v3.6.0 | (c) OpenJS Foundation and other contributors | jquery.org/license */ +!function(e,t){"use strict";"object"==typeof module&&"object"==typeof module.exports?module.exports=e.document?t(e,!0):function(e){if(!e.document)throw new Error("jQuery requires a window with a document");return t(e)}:t(e)}("undefined"!=typeof window?window:this,function(C,e){"use strict";var t=[],r=Object.getPrototypeOf,s=t.slice,g=t.flat?function(e){return t.flat.call(e)}:function(e){return t.concat.apply([],e)},u=t.push,i=t.indexOf,n={},o=n.toString,v=n.hasOwnProperty,a=v.toString,l=a.call(Object),y={},m=function(e){return"function"==typeof e&&"number"!=typeof e.nodeType&&"function"!=typeof e.item},x=function(e){return null!=e&&e===e.window},E=C.document,c={type:!0,src:!0,nonce:!0,noModule:!0};function b(e,t,n){var r,i,o=(n=n||E).createElement("script");if(o.text=e,t)for(r in c)(i=t[r]||t.getAttribute&&t.getAttribute(r))&&o.setAttribute(r,i);n.head.appendChild(o).parentNode.removeChild(o)}function w(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?n[o.call(e)]||"object":typeof e}var f="3.6.0",S=function(e,t){return new S.fn.init(e,t)};function p(e){var t=!!e&&"length"in e&&e.length,n=w(e);return!m(e)&&!x(e)&&("array"===n||0===t||"number"==typeof t&&0+~]|"+M+")"+M+"*"),U=new RegExp(M+"|>"),X=new RegExp(F),V=new RegExp("^"+I+"$"),G={ID:new RegExp("^#("+I+")"),CLASS:new RegExp("^\\.("+I+")"),TAG:new RegExp("^("+I+"|[*])"),ATTR:new RegExp("^"+W),PSEUDO:new RegExp("^"+F),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+M+"*(even|odd|(([+-]|)(\\d*)n|)"+M+"*(?:([+-]|)"+M+"*(\\d+)|))"+M+"*\\)|)","i"),bool:new RegExp("^(?:"+R+")$","i"),needsContext:new RegExp("^"+M+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+M+"*((?:-\\d)?\\d*)"+M+"*\\)|)(?=[^-]|$)","i")},Y=/HTML$/i,Q=/^(?:input|select|textarea|button)$/i,J=/^h\d$/i,K=/^[^{]+\{\s*\[native \w/,Z=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,ee=/[+~]/,te=new RegExp("\\\\[\\da-fA-F]{1,6}"+M+"?|\\\\([^\\r\\n\\f])","g"),ne=function(e,t){var n="0x"+e.slice(1)-65536;return 
t||(n<0?String.fromCharCode(n+65536):String.fromCharCode(n>>10|55296,1023&n|56320))},re=/([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g,ie=function(e,t){return t?"\0"===e?"\ufffd":e.slice(0,-1)+"\\"+e.charCodeAt(e.length-1).toString(16)+" ":"\\"+e},oe=function(){T()},ae=be(function(e){return!0===e.disabled&&"fieldset"===e.nodeName.toLowerCase()},{dir:"parentNode",next:"legend"});try{H.apply(t=O.call(p.childNodes),p.childNodes),t[p.childNodes.length].nodeType}catch(e){H={apply:t.length?function(e,t){L.apply(e,O.call(t))}:function(e,t){var n=e.length,r=0;while(e[n++]=t[r++]);e.length=n-1}}}function se(t,e,n,r){var i,o,a,s,u,l,c,f=e&&e.ownerDocument,p=e?e.nodeType:9;if(n=n||[],"string"!=typeof t||!t||1!==p&&9!==p&&11!==p)return n;if(!r&&(T(e),e=e||C,E)){if(11!==p&&(u=Z.exec(t)))if(i=u[1]){if(9===p){if(!(a=e.getElementById(i)))return n;if(a.id===i)return n.push(a),n}else if(f&&(a=f.getElementById(i))&&y(e,a)&&a.id===i)return n.push(a),n}else{if(u[2])return H.apply(n,e.getElementsByTagName(t)),n;if((i=u[3])&&d.getElementsByClassName&&e.getElementsByClassName)return H.apply(n,e.getElementsByClassName(i)),n}if(d.qsa&&!N[t+" "]&&(!v||!v.test(t))&&(1!==p||"object"!==e.nodeName.toLowerCase())){if(c=t,f=e,1===p&&(U.test(t)||z.test(t))){(f=ee.test(t)&&ye(e.parentNode)||e)===e&&d.scope||((s=e.getAttribute("id"))?s=s.replace(re,ie):e.setAttribute("id",s=S)),o=(l=h(t)).length;while(o--)l[o]=(s?"#"+s:":scope")+" "+xe(l[o]);c=l.join(",")}try{return H.apply(n,f.querySelectorAll(c)),n}catch(e){N(t,!0)}finally{s===S&&e.removeAttribute("id")}}}return g(t.replace($,"$1"),e,n,r)}function ue(){var r=[];return function e(t,n){return r.push(t+" ")>b.cacheLength&&delete e[r.shift()],e[t+" "]=n}}function le(e){return e[S]=!0,e}function ce(e){var t=C.createElement("fieldset");try{return!!e(t)}catch(e){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function fe(e,t){var n=e.split("|"),r=n.length;while(r--)b.attrHandle[n[r]]=t}function pe(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&e.sourceIndex-t.sourceIndex;if(r)return r;if(n)while(n=n.nextSibling)if(n===t)return-1;return e?1:-1}function de(t){return function(e){return"input"===e.nodeName.toLowerCase()&&e.type===t}}function he(n){return function(e){var t=e.nodeName.toLowerCase();return("input"===t||"button"===t)&&e.type===n}}function ge(t){return function(e){return"form"in e?e.parentNode&&!1===e.disabled?"label"in e?"label"in e.parentNode?e.parentNode.disabled===t:e.disabled===t:e.isDisabled===t||e.isDisabled!==!t&&ae(e)===t:e.disabled===t:"label"in e&&e.disabled===t}}function ve(a){return le(function(o){return o=+o,le(function(e,t){var n,r=a([],e.length,o),i=r.length;while(i--)e[n=r[i]]&&(e[n]=!(t[n]=e[n]))})})}function ye(e){return e&&"undefined"!=typeof e.getElementsByTagName&&e}for(e in d=se.support={},i=se.isXML=function(e){var t=e&&e.namespaceURI,n=e&&(e.ownerDocument||e).documentElement;return!Y.test(t||n&&n.nodeName||"HTML")},T=se.setDocument=function(e){var t,n,r=e?e.ownerDocument||e:p;return r!=C&&9===r.nodeType&&r.documentElement&&(a=(C=r).documentElement,E=!i(C),p!=C&&(n=C.defaultView)&&n.top!==n&&(n.addEventListener?n.addEventListener("unload",oe,!1):n.attachEvent&&n.attachEvent("onunload",oe)),d.scope=ce(function(e){return a.appendChild(e).appendChild(C.createElement("div")),"undefined"!=typeof e.querySelectorAll&&!e.querySelectorAll(":scope fieldset div").length}),d.attributes=ce(function(e){return e.className="i",!e.getAttribute("className")}),d.getElementsByTagName=ce(function(e){return 
e.appendChild(C.createComment("")),!e.getElementsByTagName("*").length}),d.getElementsByClassName=K.test(C.getElementsByClassName),d.getById=ce(function(e){return a.appendChild(e).id=S,!C.getElementsByName||!C.getElementsByName(S).length}),d.getById?(b.filter.ID=function(e){var t=e.replace(te,ne);return function(e){return e.getAttribute("id")===t}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n=t.getElementById(e);return n?[n]:[]}}):(b.filter.ID=function(e){var n=e.replace(te,ne);return function(e){var t="undefined"!=typeof e.getAttributeNode&&e.getAttributeNode("id");return t&&t.value===n}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n,r,i,o=t.getElementById(e);if(o){if((n=o.getAttributeNode("id"))&&n.value===e)return[o];i=t.getElementsByName(e),r=0;while(o=i[r++])if((n=o.getAttributeNode("id"))&&n.value===e)return[o]}return[]}}),b.find.TAG=d.getElementsByTagName?function(e,t){return"undefined"!=typeof t.getElementsByTagName?t.getElementsByTagName(e):d.qsa?t.querySelectorAll(e):void 0}:function(e,t){var n,r=[],i=0,o=t.getElementsByTagName(e);if("*"===e){while(n=o[i++])1===n.nodeType&&r.push(n);return r}return o},b.find.CLASS=d.getElementsByClassName&&function(e,t){if("undefined"!=typeof t.getElementsByClassName&&E)return t.getElementsByClassName(e)},s=[],v=[],(d.qsa=K.test(C.querySelectorAll))&&(ce(function(e){var t;a.appendChild(e).innerHTML="",e.querySelectorAll("[msallowcapture^='']").length&&v.push("[*^$]="+M+"*(?:''|\"\")"),e.querySelectorAll("[selected]").length||v.push("\\["+M+"*(?:value|"+R+")"),e.querySelectorAll("[id~="+S+"-]").length||v.push("~="),(t=C.createElement("input")).setAttribute("name",""),e.appendChild(t),e.querySelectorAll("[name='']").length||v.push("\\["+M+"*name"+M+"*="+M+"*(?:''|\"\")"),e.querySelectorAll(":checked").length||v.push(":checked"),e.querySelectorAll("a#"+S+"+*").length||v.push(".#.+[+~]"),e.querySelectorAll("\\\f"),v.push("[\\r\\n\\f]")}),ce(function(e){e.innerHTML="";var t=C.createElement("input");t.setAttribute("type","hidden"),e.appendChild(t).setAttribute("name","D"),e.querySelectorAll("[name=d]").length&&v.push("name"+M+"*[*^$|!~]?="),2!==e.querySelectorAll(":enabled").length&&v.push(":enabled",":disabled"),a.appendChild(e).disabled=!0,2!==e.querySelectorAll(":disabled").length&&v.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),v.push(",.*:")})),(d.matchesSelector=K.test(c=a.matches||a.webkitMatchesSelector||a.mozMatchesSelector||a.oMatchesSelector||a.msMatchesSelector))&&ce(function(e){d.disconnectedMatch=c.call(e,"*"),c.call(e,"[s!='']:x"),s.push("!=",F)}),v=v.length&&new RegExp(v.join("|")),s=s.length&&new RegExp(s.join("|")),t=K.test(a.compareDocumentPosition),y=t||K.test(a.contains)?function(e,t){var n=9===e.nodeType?e.documentElement:e,r=t&&t.parentNode;return e===r||!(!r||1!==r.nodeType||!(n.contains?n.contains(r):e.compareDocumentPosition&&16&e.compareDocumentPosition(r)))}:function(e,t){if(t)while(t=t.parentNode)if(t===e)return!0;return!1},j=t?function(e,t){if(e===t)return l=!0,0;var n=!e.compareDocumentPosition-!t.compareDocumentPosition;return n||(1&(n=(e.ownerDocument||e)==(t.ownerDocument||t)?e.compareDocumentPosition(t):1)||!d.sortDetached&&t.compareDocumentPosition(e)===n?e==C||e.ownerDocument==p&&y(p,e)?-1:t==C||t.ownerDocument==p&&y(p,t)?1:u?P(u,e)-P(u,t):0:4&n?-1:1)}:function(e,t){if(e===t)return l=!0,0;var n,r=0,i=e.parentNode,o=t.parentNode,a=[e],s=[t];if(!i||!o)return e==C?-1:t==C?1:i?-1:o?1:u?P(u,e)-P(u,t):0;if(i===o)return 
pe(e,t);n=e;while(n=n.parentNode)a.unshift(n);n=t;while(n=n.parentNode)s.unshift(n);while(a[r]===s[r])r++;return r?pe(a[r],s[r]):a[r]==p?-1:s[r]==p?1:0}),C},se.matches=function(e,t){return se(e,null,null,t)},se.matchesSelector=function(e,t){if(T(e),d.matchesSelector&&E&&!N[t+" "]&&(!s||!s.test(t))&&(!v||!v.test(t)))try{var n=c.call(e,t);if(n||d.disconnectedMatch||e.document&&11!==e.document.nodeType)return n}catch(e){N(t,!0)}return 0":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(te,ne),e[3]=(e[3]||e[4]||e[5]||"").replace(te,ne),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&X.test(n)&&(t=h(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(te,ne).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=m[e+" "];return t||(t=new RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&m(e,function(e){return t.test("string"==typeof e.className&&e.className||"undefined"!=typeof e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(n,r,i){return function(e){var t=se.attr(e,n);return null==t?"!="===r:!r||(t+="","="===r?t===i:"!="===r?t!==i:"^="===r?i&&0===t.indexOf(i):"*="===r?i&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return m(n)?S.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?S.grep(e,function(e){return e===n!==r}):"string"!=typeof n?S.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(S.fn.init=function(e,t,n){var r,i;if(!e)return this;if(n=n||D,"string"==typeof e){if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:q.exec(e))||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof S?t[0]:t,S.merge(this,S.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:E,!0)),N.test(r[1])&&S.isPlainObject(t))for(r in t)m(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(i=E.getElementById(r[2]))&&(this[0]=i,this.length=1),this}return e.nodeType?(this[0]=e,this.length=1,this):m(e)?void 0!==n.ready?n.ready(e):e(S):S.makeArray(e,this)}).prototype=S.fn,D=S(E);var L=/^(?:parents|prev(?:Until|All))/,H={children:!0,contents:!0,next:!0,prev:!0};function O(e,t){while((e=e[t])&&1!==e.nodeType);return e}S.fn.extend({has:function(e){var t=S(e,this),n=t.length;return this.filter(function(){for(var e=0;e\x20\t\r\n\f]*)/i,he=/^$|^module$|\/(?:java|ecma)script/i;ce=E.createDocumentFragment().appendChild(E.createElement("div")),(fe=E.createElement("input")).setAttribute("type","radio"),fe.setAttribute("checked","checked"),fe.setAttribute("name","t"),ce.appendChild(fe),y.checkClone=ce.cloneNode(!0).cloneNode(!0).lastChild.checked,ce.innerHTML="",y.noCloneChecked=!!ce.cloneNode(!0).lastChild.defaultValue,ce.innerHTML="",y.option=!!ce.lastChild;var ge={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function ve(e,t){var n;return n="undefined"!=typeof e.getElementsByTagName?e.getElementsByTagName(t||"*"):"undefined"!=typeof e.querySelectorAll?e.querySelectorAll(t||"*"):[],void 0===t||t&&A(e,t)?S.merge([e],n):n}function ye(e,t){for(var n=0,r=e.length;n",""]);var me=/<|&#?\w+;/;function xe(e,t,n,r,i){for(var o,a,s,u,l,c,f=t.createDocumentFragment(),p=[],d=0,h=e.length;d\s*$/g;function je(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&S(e).children("tbody")[0]||e}function De(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function qe(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Le(e,t){var n,r,i,o,a,s;if(1===t.nodeType){if(Y.hasData(e)&&(s=Y.get(e).events))for(i in Y.remove(t,"handle events"),s)for(n=0,r=s[i].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",i=function(e){r.remove(),i=null,e&&t("error"===e.type?404:200,e.type)}),E.head.appendChild(r[0])},abort:function(){i&&i()}}});var _t,zt=[],Ut=/(=)\?(?=&|$)|\?\?/;S.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=zt.pop()||S.expando+"_"+wt.guid++;return this[e]=!0,e}}),S.ajaxPrefilter("json jsonp",function(e,t,n){var r,i,o,a=!1!==e.jsonp&&(Ut.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Ut.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=m(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Ut,"$1"+r):!1!==e.jsonp&&(e.url+=(Tt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return o||S.error(r+" was not called"),o[0]},e.dataTypes[0]="json",i=C[r],C[r]=function(){o=arguments},n.always(function(){void 0===i?S(C).removeProp(r):C[r]=i,e[r]&&(e.jsonpCallback=t.jsonpCallback,zt.push(r)),o&&m(i)&&i(o[0]),o=i=void 0}),"script"}),y.createHTMLDocument=((_t=E.implementation.createHTMLDocument("").body).innerHTML="
",2===_t.childNodes.length),S.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=E.implementation.createHTMLDocument("")).createElement("base")).href=E.location.href,t.head.appendChild(r)):t=E),o=!n&&[],(i=N.exec(e))?[t.createElement(i[1])]:(i=xe([e],t,o),o&&o.length&&S(o).remove(),S.merge([],i.childNodes)));var r,i,o},S.fn.load=function(e,t,n){var r,i,o,a=this,s=e.indexOf(" ");return-1").append(S.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,o||[e.responseText,t,e])})}),this},S.expr.pseudos.animated=function(t){return S.grep(S.timers,function(e){return t===e.elem}).length},S.offset={setOffset:function(e,t,n){var r,i,o,a,s,u,l=S.css(e,"position"),c=S(e),f={};"static"===l&&(e.style.position="relative"),s=c.offset(),o=S.css(e,"top"),u=S.css(e,"left"),("absolute"===l||"fixed"===l)&&-1<(o+u).indexOf("auto")?(a=(r=c.position()).top,i=r.left):(a=parseFloat(o)||0,i=parseFloat(u)||0),m(t)&&(t=t.call(e,n,S.extend({},s))),null!=t.top&&(f.top=t.top-s.top+a),null!=t.left&&(f.left=t.left-s.left+i),"using"in t?t.using.call(e,f):c.css(f)}},S.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){S.offset.setOffset(this,t,e)});var e,n,r=this[0];return r?r.getClientRects().length?(e=r.getBoundingClientRect(),n=r.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var e,t,n,r=this[0],i={top:0,left:0};if("fixed"===S.css(r,"position"))t=r.getBoundingClientRect();else{t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;while(e&&(e===n.body||e===n.documentElement)&&"static"===S.css(e,"position"))e=e.parentNode;e&&e!==r&&1===e.nodeType&&((i=S(e).offset()).top+=S.css(e,"borderTopWidth",!0),i.left+=S.css(e,"borderLeftWidth",!0))}return{top:t.top-i.top-S.css(r,"marginTop",!0),left:t.left-i.left-S.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent;while(e&&"static"===S.css(e,"position"))e=e.offsetParent;return e||re})}}),S.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,i){var o="pageYOffset"===i;S.fn[t]=function(e){return $(this,function(e,t,n){var r;if(x(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n)return r?r[i]:e[t];r?r.scrollTo(o?r.pageXOffset:n,o?n:r.pageYOffset):e[t]=n},t,e,arguments.length)}}),S.each(["top","left"],function(e,n){S.cssHooks[n]=Fe(y.pixelPosition,function(e,t){if(t)return t=We(e,n),Pe.test(t)?S(e).position()[n]+"px":t})}),S.each({Height:"height",Width:"width"},function(a,s){S.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,o){S.fn[o]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),i=r||(!0===e||!0===t?"margin":"border");return $(this,function(e,t,n){var r;return x(e)?0===o.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?S.css(e,t,i):S.style(e,t,n,i)},s,n?e:void 0,n)}})}),S.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){S.fn[t]=function(e){return this.on(t,e)}}),S.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return 
this.mouseenter(e).mouseleave(t||e)}}),S.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){S.fn[n]=function(e,t){return 0",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=y.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=y.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),y.elements=c+" "+a,j(b)}function f(a){var b=x[a[v]];return b||(b={},w++,a[v]=w,x[w]=b),b}function g(a,c,d){if(c||(c=b),q)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():u.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||t.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),q)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return y.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(y,b.frag)}function j(a){a||(a=b);var d=f(a);return!y.shivCSS||p||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),q||i(a,d),a}function k(a){for(var b,c=a.getElementsByTagName("*"),e=c.length,f=RegExp("^(?:"+d().join("|")+")$","i"),g=[];e--;)b=c[e],f.test(b.nodeName)&&g.push(b.applyElement(l(b)));return g}function l(a){for(var b,c=a.attributes,d=c.length,e=a.ownerDocument.createElement(A+":"+a.nodeName);d--;)b=c[d],b.specified&&e.setAttribute(b.nodeName,b.nodeValue);return e.style.cssText=a.style.cssText,e}function m(a){for(var b,c=a.split("{"),e=c.length,f=RegExp("(^|[\\s,>+~])("+d().join("|")+")(?=[[\\s,>+~#.:]|$)","gi"),g="$1"+A+"\\:$2";e--;)b=c[e]=c[e].split("}"),b[b.length-1]=b[b.length-1].replace(f,g),c[e]=b.join("}");return c.join("{")}function n(a){for(var b=a.length;b--;)a[b].removeNode()}function o(a){function b(){clearTimeout(g._removeSheetTimer),d&&d.removeNode(!0),d=null}var d,e,g=f(a),h=a.namespaces,i=a.parentWindow;return!B||a.printShived?a:("undefined"==typeof h[A]&&h.add(A),i.attachEvent("onbeforeprint",function(){b();for(var f,g,h,i=a.styleSheets,j=[],l=i.length,n=Array(l);l--;)n[l]=i[l];for(;h=n.pop();)if(!h.disabled&&z.test(h.media)){try{f=h.imports,g=f.length}catch(o){g=0}for(l=0;g>l;l++)n.push(f[l]);try{j.push(h.cssText)}catch(o){}}j=m(j.reverse().join("")),e=k(a),d=c(a,j)}),i.attachEvent("onafterprint",function(){n(e),clearTimeout(g._removeSheetTimer),g._removeSheetTimer=setTimeout(b,500)}),a.printShived=!0,a)}var p,q,r="3.7.3",s=a.html5||{},t=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,u=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,v="_html5shiv",w=0,x={};!function(){try{var a=b.createElement("a");a.innerHTML="",p="hidden"in a,q=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof 
a.createElement}()}catch(c){p=!0,q=!0}}();var y={elements:s.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:r,shivCSS:s.shivCSS!==!1,supportsUnknownElements:q,shivMethods:s.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=y,j(b);var z=/^$|\b(?:all|print)\b/,A="html5shiv",B=!q&&function(){var c=b.documentElement;return!("undefined"==typeof b.namespaces||"undefined"==typeof b.parentWindow||"undefined"==typeof c.applyElement||"undefined"==typeof c.removeNode||"undefined"==typeof a.attachEvent)}();y.type+=" print",y.shivPrint=o,o(b),"object"==typeof module&&module.exports&&(module.exports=y)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/_static/js/html5shiv.min.js b/_static/js/html5shiv.min.js new file mode 100644 index 0000000..cd1c674 --- /dev/null +++ b/_static/js/html5shiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3 | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=t.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=t.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),t.elements=c+" "+a,j(b)}function f(a){var b=s[a[q]];return b||(b={},r++,a[q]=r,s[r]=b),b}function g(a,c,d){if(c||(c=b),l)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():p.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||o.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),l)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return t.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(t,b.frag)}function j(a){a||(a=b);var d=f(a);return!t.shivCSS||k||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),l||i(a,d),a}var k,l,m="3.7.3-pre",n=a.html5||{},o=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,p=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,q="_html5shiv",r=0,s={};!function(){try{var a=b.createElement("a");a.innerHTML="",k="hidden"in a,l=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){k=!0,l=!0}}();var t={elements:n.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time 
video",version:m,shivCSS:n.shivCSS!==!1,supportsUnknownElements:l,shivMethods:n.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=t,j(b),"object"==typeof module&&module.exports&&(module.exports=t)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/_static/js/theme.js b/_static/js/theme.js new file mode 100644 index 0000000..1fddb6e --- /dev/null +++ b/_static/js/theme.js @@ -0,0 +1 @@ +!function(n){var e={};function t(i){if(e[i])return e[i].exports;var o=e[i]={i:i,l:!1,exports:{}};return n[i].call(o.exports,o,o.exports,t),o.l=!0,o.exports}t.m=n,t.c=e,t.d=function(n,e,i){t.o(n,e)||Object.defineProperty(n,e,{enumerable:!0,get:i})},t.r=function(n){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(n,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(n,"__esModule",{value:!0})},t.t=function(n,e){if(1&e&&(n=t(n)),8&e)return n;if(4&e&&"object"==typeof n&&n&&n.__esModule)return n;var i=Object.create(null);if(t.r(i),Object.defineProperty(i,"default",{enumerable:!0,value:n}),2&e&&"string"!=typeof n)for(var o in n)t.d(i,o,function(e){return n[e]}.bind(null,o));return i},t.n=function(n){var e=n&&n.__esModule?function(){return n.default}:function(){return n};return t.d(e,"a",e),e},t.o=function(n,e){return Object.prototype.hasOwnProperty.call(n,e)},t.p="",t(t.s=0)}([function(n,e,t){t(1),n.exports=t(3)},function(n,e,t){(function(){var e="undefined"!=typeof window?window.jQuery:t(2);n.exports.ThemeNav={navBar:null,win:null,winScroll:!1,winResize:!1,linkScroll:!1,winPosition:0,winHeight:null,docHeight:null,isRunning:!1,enable:function(n){var t=this;void 0===n&&(n=!0),t.isRunning||(t.isRunning=!0,e((function(e){t.init(e),t.reset(),t.win.on("hashchange",t.reset),n&&t.win.on("scroll",(function(){t.linkScroll||t.winScroll||(t.winScroll=!0,requestAnimationFrame((function(){t.onScroll()})))})),t.win.on("resize",(function(){t.winResize||(t.winResize=!0,requestAnimationFrame((function(){t.onResize()})))})),t.onResize()})))},enableSticky:function(){this.enable(!0)},init:function(n){n(document);var e=this;this.navBar=n("div.wy-side-scroll:first"),this.win=n(window),n(document).on("click","[data-toggle='wy-nav-top']",(function(){n("[data-toggle='wy-nav-shift']").toggleClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift")})).on("click",".wy-menu-vertical .current ul li a",(function(){var t=n(this);n("[data-toggle='wy-nav-shift']").removeClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift"),e.toggleCurrent(t),e.hashChange()})).on("click","[data-toggle='rst-current-version']",(function(){n("[data-toggle='rst-versions']").toggleClass("shift-up")})),n("table.docutils:not(.field-list,.footnote,.citation)").wrap("
"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t0 + var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 + var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 + var s_v = "^(" + C + ")?" 
+ v; // vowel in stem + + this.stemWord = function (w) { + var stem; + var suffix; + var firstch; + var origword = w; + + if (w.length < 3) + return w; + + var re; + var re2; + var re3; + var re4; + + firstch = w.substr(0,1); + if (firstch == "y") + w = firstch.toUpperCase() + w.substr(1); + + // Step 1a + re = /^(.+?)(ss|i)es$/; + re2 = /^(.+?)([^s])s$/; + + if (re.test(w)) + w = w.replace(re,"$1$2"); + else if (re2.test(w)) + w = w.replace(re2,"$1$2"); + + // Step 1b + re = /^(.+?)eed$/; + re2 = /^(.+?)(ed|ing)$/; + if (re.test(w)) { + var fp = re.exec(w); + re = new RegExp(mgr0); + if (re.test(fp[1])) { + re = /.$/; + w = w.replace(re,""); + } + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = new RegExp(s_v); + if (re2.test(stem)) { + w = stem; + re2 = /(at|bl|iz)$/; + re3 = new RegExp("([^aeiouylsz])\\1$"); + re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re2.test(w)) + w = w + "e"; + else if (re3.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + else if (re4.test(w)) + w = w + "e"; + } + } + + // Step 1c + re = /^(.+?)y$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(s_v); + if (re.test(stem)) + w = stem + "i"; + } + + // Step 2 + re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step2list[suffix]; + } + + // Step 3 + re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step3list[suffix]; + } + + // Step 4 + re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + re2 = /^(.+?)(s|t)(ion)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + if (re.test(stem)) + w = stem; + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = new RegExp(mgr1); + if (re2.test(stem)) + w = stem; + } + + // Step 5 + re = /^(.+?)e$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + re2 = new RegExp(meq1); + re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) + w = stem; + } + re = /ll$/; + re2 = new RegExp(mgr1); + if (re.test(w) && re2.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + + // and turn initial Y back to y + if (firstch == "y") + w = firstch.toLowerCase() + w.substr(1); + return w; + } +} + diff --git a/_static/minus.png b/_static/minus.png new file mode 100644 index 0000000..d96755f Binary files /dev/null and b/_static/minus.png differ diff --git a/_static/plus.png b/_static/plus.png new file mode 100644 index 0000000..7107cec Binary files /dev/null and b/_static/plus.png differ diff --git a/_static/pygments.css b/_static/pygments.css new file mode 100644 index 0000000..08bec68 --- /dev/null +++ b/_static/pygments.css @@ -0,0 +1,74 @@ +pre { line-height: 125%; } +td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; 
padding-right: 5px; } +.highlight .hll { background-color: #ffffcc } +.highlight { background: #f8f8f8; } +.highlight .c { color: #3D7B7B; font-style: italic } /* Comment */ +.highlight .err { border: 1px solid #FF0000 } /* Error */ +.highlight .k { color: #008000; font-weight: bold } /* Keyword */ +.highlight .o { color: #666666 } /* Operator */ +.highlight .ch { color: #3D7B7B; font-style: italic } /* Comment.Hashbang */ +.highlight .cm { color: #3D7B7B; font-style: italic } /* Comment.Multiline */ +.highlight .cp { color: #9C6500 } /* Comment.Preproc */ +.highlight .cpf { color: #3D7B7B; font-style: italic } /* Comment.PreprocFile */ +.highlight .c1 { color: #3D7B7B; font-style: italic } /* Comment.Single */ +.highlight .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */ +.highlight .gd { color: #A00000 } /* Generic.Deleted */ +.highlight .ge { font-style: italic } /* Generic.Emph */ +.highlight .gr { color: #E40000 } /* Generic.Error */ +.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ +.highlight .gi { color: #008400 } /* Generic.Inserted */ +.highlight .go { color: #717171 } /* Generic.Output */ +.highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */ +.highlight .gs { font-weight: bold } /* Generic.Strong */ +.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ +.highlight .gt { color: #0044DD } /* Generic.Traceback */ +.highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */ +.highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */ +.highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */ +.highlight .kp { color: #008000 } /* Keyword.Pseudo */ +.highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */ +.highlight .kt { color: #B00040 } /* Keyword.Type */ +.highlight .m { color: #666666 } /* Literal.Number */ +.highlight .s { color: #BA2121 } /* Literal.String */ +.highlight .na { color: #687822 } /* Name.Attribute */ +.highlight .nb { color: #008000 } /* Name.Builtin */ +.highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */ +.highlight .no { color: #880000 } /* Name.Constant */ +.highlight .nd { color: #AA22FF } /* Name.Decorator */ +.highlight .ni { color: #717171; font-weight: bold } /* Name.Entity */ +.highlight .ne { color: #CB3F38; font-weight: bold } /* Name.Exception */ +.highlight .nf { color: #0000FF } /* Name.Function */ +.highlight .nl { color: #767600 } /* Name.Label */ +.highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */ +.highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */ +.highlight .nv { color: #19177C } /* Name.Variable */ +.highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */ +.highlight .w { color: #bbbbbb } /* Text.Whitespace */ +.highlight .mb { color: #666666 } /* Literal.Number.Bin */ +.highlight .mf { color: #666666 } /* Literal.Number.Float */ +.highlight .mh { color: #666666 } /* Literal.Number.Hex */ +.highlight .mi { color: #666666 } /* Literal.Number.Integer */ +.highlight .mo { color: #666666 } /* Literal.Number.Oct */ +.highlight .sa { color: #BA2121 } /* Literal.String.Affix */ +.highlight .sb { color: #BA2121 } /* Literal.String.Backtick */ +.highlight .sc { color: #BA2121 } /* Literal.String.Char */ +.highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */ +.highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */ +.highlight .s2 { color: #BA2121 } /* Literal.String.Double */ +.highlight 
.se { color: #AA5D1F; font-weight: bold } /* Literal.String.Escape */ +.highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */ +.highlight .si { color: #A45A77; font-weight: bold } /* Literal.String.Interpol */ +.highlight .sx { color: #008000 } /* Literal.String.Other */ +.highlight .sr { color: #A45A77 } /* Literal.String.Regex */ +.highlight .s1 { color: #BA2121 } /* Literal.String.Single */ +.highlight .ss { color: #19177C } /* Literal.String.Symbol */ +.highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */ +.highlight .fm { color: #0000FF } /* Name.Function.Magic */ +.highlight .vc { color: #19177C } /* Name.Variable.Class */ +.highlight .vg { color: #19177C } /* Name.Variable.Global */ +.highlight .vi { color: #19177C } /* Name.Variable.Instance */ +.highlight .vm { color: #19177C } /* Name.Variable.Magic */ +.highlight .il { color: #666666 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/_static/searchtools.js b/_static/searchtools.js new file mode 100644 index 0000000..97d56a7 --- /dev/null +++ b/_static/searchtools.js @@ -0,0 +1,566 @@ +/* + * searchtools.js + * ~~~~~~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for the full-text search. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +/** + * Simple result scoring code. + */ +if (typeof Scorer === "undefined") { + var Scorer = { + // Implement the following function to further tweak the score for each result + // The function takes a result array [docname, title, anchor, descr, score, filename] + // and returns the new score. + /* + score: result => { + const [docname, title, anchor, descr, score, filename] = result + return score + }, + */ + + // query matches the full name of an object + objNameMatch: 11, + // or matches in the last dotted part of the object name + objPartialMatch: 6, + // Additive scores depending on the priority of the object + objPrio: { + 0: 15, // used to be importantResults + 1: 5, // used to be objectResults + 2: -5, // used to be unimportantResults + }, + // Used when the priority is not in the mapping. 
+ objPrioDefault: 0, + + // query found in title + title: 15, + partialTitle: 7, + // query found in terms + term: 5, + partialTerm: 2, + }; +} + +const _removeChildren = (element) => { + while (element && element.lastChild) element.removeChild(element.lastChild); +}; + +/** + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping + */ +const _escapeRegExp = (string) => + string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string + +const _displayItem = (item, searchTerms) => { + const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; + const docUrlRoot = DOCUMENTATION_OPTIONS.URL_ROOT; + const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; + const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; + const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; + + const [docName, title, anchor, descr, score, _filename] = item; + + let listItem = document.createElement("li"); + let requestUrl; + let linkUrl; + if (docBuilder === "dirhtml") { + // dirhtml builder + let dirname = docName + "/"; + if (dirname.match(/\/index\/$/)) + dirname = dirname.substring(0, dirname.length - 6); + else if (dirname === "index/") dirname = ""; + requestUrl = docUrlRoot + dirname; + linkUrl = requestUrl; + } else { + // normal html builders + requestUrl = docUrlRoot + docName + docFileSuffix; + linkUrl = docName + docLinkSuffix; + } + let linkEl = listItem.appendChild(document.createElement("a")); + linkEl.href = linkUrl + anchor; + linkEl.dataset.score = score; + linkEl.innerHTML = title; + if (descr) + listItem.appendChild(document.createElement("span")).innerHTML = + " (" + descr + ")"; + else if (showSearchSummary) + fetch(requestUrl) + .then((responseData) => responseData.text()) + .then((data) => { + if (data) + listItem.appendChild( + Search.makeSearchSummary(data, searchTerms) + ); + }); + Search.output.appendChild(listItem); +}; +const _finishSearch = (resultCount) => { + Search.stopPulse(); + Search.title.innerText = _("Search Results"); + if (!resultCount) + Search.status.innerText = Documentation.gettext( + "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." + ); + else + Search.status.innerText = _( + `Search finished, found ${resultCount} page(s) matching the search query.` + ); +}; +const _displayNextItem = ( + results, + resultCount, + searchTerms +) => { + // results left, load the summary and display it + // this is intended to be dynamic (don't sub resultsCount) + if (results.length) { + _displayItem(results.pop(), searchTerms); + setTimeout( + () => _displayNextItem(results, resultCount, searchTerms), + 5 + ); + } + // search finished, update title and status message + else _finishSearch(resultCount); +}; + +/** + * Default splitQuery function. Can be overridden in ``sphinx.search`` with a + * custom function per language. + * + * The regular expression works by splitting the string on consecutive characters + * that are not Unicode letters, numbers, underscores, or emoji characters. + * This is the same as ``\W+`` in Python, preserving the surrogate pair area. 
+ */ +if (typeof splitQuery === "undefined") { + var splitQuery = (query) => query + .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) + .filter(term => term) // remove remaining empty strings +} + +/** + * Search Module + */ +const Search = { + _index: null, + _queued_query: null, + _pulse_status: -1, + + htmlToText: (htmlString) => { + const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); + htmlElement.querySelectorAll(".headerlink").forEach((el) => { el.remove() }); + const docContent = htmlElement.querySelector('[role="main"]'); + if (docContent !== undefined) return docContent.textContent; + console.warn( + "Content block not found. Sphinx search tries to obtain it via '[role=main]'. Could you check your theme or template." + ); + return ""; + }, + + init: () => { + const query = new URLSearchParams(window.location.search).get("q"); + document + .querySelectorAll('input[name="q"]') + .forEach((el) => (el.value = query)); + if (query) Search.performSearch(query); + }, + + loadIndex: (url) => + (document.body.appendChild(document.createElement("script")).src = url), + + setIndex: (index) => { + Search._index = index; + if (Search._queued_query !== null) { + const query = Search._queued_query; + Search._queued_query = null; + Search.query(query); + } + }, + + hasIndex: () => Search._index !== null, + + deferQuery: (query) => (Search._queued_query = query), + + stopPulse: () => (Search._pulse_status = -1), + + startPulse: () => { + if (Search._pulse_status >= 0) return; + + const pulse = () => { + Search._pulse_status = (Search._pulse_status + 1) % 4; + Search.dots.innerText = ".".repeat(Search._pulse_status); + if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); + }; + pulse(); + }, + + /** + * perform a search for something (or wait until index is loaded) + */ + performSearch: (query) => { + // create the required interface elements + const searchText = document.createElement("h2"); + searchText.textContent = _("Searching"); + const searchSummary = document.createElement("p"); + searchSummary.classList.add("search-summary"); + searchSummary.innerText = ""; + const searchList = document.createElement("ul"); + searchList.classList.add("search"); + + const out = document.getElementById("search-results"); + Search.title = out.appendChild(searchText); + Search.dots = Search.title.appendChild(document.createElement("span")); + Search.status = out.appendChild(searchSummary); + Search.output = out.appendChild(searchList); + + const searchProgress = document.getElementById("search-progress"); + // Some themes don't use the search progress node + if (searchProgress) { + searchProgress.innerText = _("Preparing search..."); + } + Search.startPulse(); + + // index already loaded, the browser was quick! 
+ if (Search.hasIndex()) Search.query(query); + else Search.deferQuery(query); + }, + + /** + * execute search (requires search index to be loaded) + */ + query: (query) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + const allTitles = Search._index.alltitles; + const indexEntries = Search._index.indexentries; + + // stem the search terms and add them to the correct list + const stemmer = new Stemmer(); + const searchTerms = new Set(); + const excludedTerms = new Set(); + const highlightTerms = new Set(); + const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); + splitQuery(query.trim()).forEach((queryTerm) => { + const queryTermLower = queryTerm.toLowerCase(); + + // maybe skip this "word" + // stopwords array is from language_data.js + if ( + stopwords.indexOf(queryTermLower) !== -1 || + queryTerm.match(/^\d+$/) + ) + return; + + // stem the word + let word = stemmer.stemWord(queryTermLower); + // select the correct list + if (word[0] === "-") excludedTerms.add(word.substr(1)); + else { + searchTerms.add(word); + highlightTerms.add(queryTermLower); + } + }); + + if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js + localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) + } + + // console.debug("SEARCH: searching for:"); + // console.info("required: ", [...searchTerms]); + // console.info("excluded: ", [...excludedTerms]); + + // array of [docname, title, anchor, descr, score, filename] + let results = []; + _removeChildren(document.getElementById("search-progress")); + + const queryLower = query.toLowerCase(); + for (const [title, foundTitles] of Object.entries(allTitles)) { + if (title.toLowerCase().includes(queryLower) && (queryLower.length >= title.length/2)) { + for (const [file, id] of foundTitles) { + let score = Math.round(100 * queryLower.length / title.length) + results.push([ + docNames[file], + titles[file] !== title ? `${titles[file]} > ${title}` : title, + id !== null ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // search for explicit entries in index directives + for (const [entry, foundEntries] of Object.entries(indexEntries)) { + if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { + for (const [file, id] of foundEntries) { + let score = Math.round(100 * queryLower.length / entry.length) + results.push([ + docNames[file], + titles[file], + id ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // lookup as object + objectTerms.forEach((term) => + results.push(...Search.performObjectSearch(term, objectTerms)) + ); + + // lookup as search terms in fulltext + results.push(...Search.performTermsSearch(searchTerms, excludedTerms)); + + // let the scorer override scores with a custom scoring function + if (Scorer.score) results.forEach((item) => (item[4] = Scorer.score(item))); + + // now sort the results by score (in opposite order of appearance, since the + // display function below uses pop() to retrieve items) and then + // alphabetically + results.sort((a, b) => { + const leftScore = a[4]; + const rightScore = b[4]; + if (leftScore === rightScore) { + // same score: sort alphabetically + const leftTitle = a[1].toLowerCase(); + const rightTitle = b[1].toLowerCase(); + if (leftTitle === rightTitle) return 0; + return leftTitle > rightTitle ? -1 : 1; // inverted is intentional + } + return leftScore > rightScore ? 
1 : -1; + }); + + // remove duplicate search results + // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept + let seen = new Set(); + results = results.reverse().reduce((acc, result) => { + let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); + if (!seen.has(resultStr)) { + acc.push(result); + seen.add(resultStr); + } + return acc; + }, []); + + results = results.reverse(); + + // for debugging + //Search.lastresults = results.slice(); // a copy + // console.info("search results:", Search.lastresults); + + // print the results + _displayNextItem(results, results.length, searchTerms); + }, + + /** + * search for object names + */ + performObjectSearch: (object, objectTerms) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const objects = Search._index.objects; + const objNames = Search._index.objnames; + const titles = Search._index.titles; + + const results = []; + + const objectSearchCallback = (prefix, match) => { + const name = match[4] + const fullname = (prefix ? prefix + "." : "") + name; + const fullnameLower = fullname.toLowerCase(); + if (fullnameLower.indexOf(object) < 0) return; + + let score = 0; + const parts = fullnameLower.split("."); + + // check for different match types: exact matches of full name or + // "last name" (i.e. last dotted part) + if (fullnameLower === object || parts.slice(-1)[0] === object) + score += Scorer.objNameMatch; + else if (parts.slice(-1)[0].indexOf(object) > -1) + score += Scorer.objPartialMatch; // matches in last name + + const objName = objNames[match[1]][2]; + const title = titles[match[0]]; + + // If more than one term searched for, we require other words to be + // found in the name/title/description + const otherTerms = new Set(objectTerms); + otherTerms.delete(object); + if (otherTerms.size > 0) { + const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); + if ( + [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) + ) + return; + } + + let anchor = match[3]; + if (anchor === "") anchor = fullname; + else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; + + const descr = objName + _(", in ") + title; + + // add custom score for some objects according to scorer + if (Scorer.objPrio.hasOwnProperty(match[2])) + score += Scorer.objPrio[match[2]]; + else score += Scorer.objPrioDefault; + + results.push([ + docNames[match[0]], + fullname, + "#" + anchor, + descr, + score, + filenames[match[0]], + ]); + }; + Object.keys(objects).forEach((prefix) => + objects[prefix].forEach((array) => + objectSearchCallback(prefix, array) + ) + ); + return results; + }, + + /** + * search for full-text terms in the index + */ + performTermsSearch: (searchTerms, excludedTerms) => { + // prepare search + const terms = Search._index.terms; + const titleTerms = Search._index.titleterms; + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + + const scoreMap = new Map(); + const fileMap = new Map(); + + // perform the search on the required terms + searchTerms.forEach((word) => { + const files = []; + const arr = [ + { files: terms[word], score: Scorer.term }, + { files: titleTerms[word], score: Scorer.title }, + ]; + // add support for partial matches + if (word.length > 2) { + const escapedWord = _escapeRegExp(word); + Object.keys(terms).forEach((term) => { + if (term.match(escapedWord) && !terms[word]) + arr.push({ 
files: terms[term], score: Scorer.partialTerm }); + }); + Object.keys(titleTerms).forEach((term) => { + if (term.match(escapedWord) && !titleTerms[word]) + arr.push({ files: titleTerms[word], score: Scorer.partialTitle }); + }); + } + + // no match but word was a required one + if (arr.every((record) => record.files === undefined)) return; + + // found search word in contents + arr.forEach((record) => { + if (record.files === undefined) return; + + let recordFiles = record.files; + if (recordFiles.length === undefined) recordFiles = [recordFiles]; + files.push(...recordFiles); + + // set score for the word in each file + recordFiles.forEach((file) => { + if (!scoreMap.has(file)) scoreMap.set(file, {}); + scoreMap.get(file)[word] = record.score; + }); + }); + + // create the mapping + files.forEach((file) => { + if (fileMap.has(file) && fileMap.get(file).indexOf(word) === -1) + fileMap.get(file).push(word); + else fileMap.set(file, [word]); + }); + }); + + // now check if the files don't contain excluded terms + const results = []; + for (const [file, wordList] of fileMap) { + // check if all requirements are matched + + // as search terms with length < 3 are discarded + const filteredTermCount = [...searchTerms].filter( + (term) => term.length > 2 + ).length; + if ( + wordList.length !== searchTerms.size && + wordList.length !== filteredTermCount + ) + continue; + + // ensure that none of the excluded terms is in the search result + if ( + [...excludedTerms].some( + (term) => + terms[term] === file || + titleTerms[term] === file || + (terms[term] || []).includes(file) || + (titleTerms[term] || []).includes(file) + ) + ) + break; + + // select one (max) score for the file. + const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); + // add result to the result list + results.push([ + docNames[file], + titles[file], + "", + null, + score, + filenames[file], + ]); + } + return results; + }, + + /** + * helper function to return a node containing the + * search summary for a given text. keywords is a list + * of stemmed words. + */ + makeSearchSummary: (htmlText, keywords) => { + const text = Search.htmlToText(htmlText); + if (text === "") return null; + + const textLower = text.toLowerCase(); + const actualStartPosition = [...keywords] + .map((k) => textLower.indexOf(k.toLowerCase())) + .filter((i) => i > -1) + .slice(-1)[0]; + const startWithContext = Math.max(actualStartPosition - 120, 0); + + const top = startWithContext === 0 ? "" : "..."; + const tail = startWithContext + 240 < text.length ? "..." : ""; + + let summary = document.createElement("p"); + summary.classList.add("context"); + summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; + + return summary; + }, +}; + +_ready(Search.init); diff --git a/_static/sphinx_highlight.js b/_static/sphinx_highlight.js new file mode 100644 index 0000000..aae669d --- /dev/null +++ b/_static/sphinx_highlight.js @@ -0,0 +1,144 @@ +/* Highlighting utilities for Sphinx HTML documentation. */ +"use strict"; + +const SPHINX_HIGHLIGHT_ENABLED = true + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. 
+ */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + parent.insertBefore( + span, + parent.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const SphinxHighlight = { + + /** + * highlight the search words provided in localstorage in the text + */ + highlightSearchWords: () => { + if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight + + // get and clear terms from localstorage + const url = new URL(window.location); + const highlight = + localStorage.getItem("sphinx_highlight_terms") + || url.searchParams.get("highlight") + || ""; + localStorage.removeItem("sphinx_highlight_terms") + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + + // get individual terms from highlight string + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? 
divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + localStorage.removeItem("sphinx_highlight_terms") + }, + + initEscapeListener: () => { + // only install a listener if it is really needed + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; + if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { + SphinxHighlight.hideSearchWords(); + event.preventDefault(); + } + }); + }, +}; + +_ready(SphinxHighlight.highlightSearchWords); +_ready(SphinxHighlight.initEscapeListener); diff --git a/autodoc/index.html b/autodoc/index.html new file mode 100644 index 0000000..791d441 --- /dev/null +++ b/autodoc/index.html @@ -0,0 +1,163 @@ + + + + + + + Lasso Classes and Functions — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

+ + + + + + + + + + + + + + + + + + +

CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files.

ModelRoadwayNetwork

Subclass of the network_wrangler class RoadwayNetwork.

Project

A single or set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks, including time of day, categories, etc.

+
+
+

Utils and Functions

+ + + + + + + + + +

util

logger

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/.buildinfo b/branch/remove_assignable/.buildinfo new file mode 100644 index 0000000..b0ac44f --- /dev/null +++ b/branch/remove_assignable/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: a6ecde6fa215bda40c6c2c465fb089ee +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/branch/remove_assignable/_generated/lasso.CubeTransit/index.html b/branch/remove_assignable/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..1bdcd1f --- /dev/null +++ b/branch/remove_assignable/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,571 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
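The names CUBE_DIR and base_transit_network in the example above are placeholders. A slightly fuller, non-authoritative sketch of the same workflow, using only the methods documented on this page (all directory and file paths below are hypothetical):

# Build a "base" and a "build" network from two sets of Cube line files
# (paths are placeholders, not shipped with the package).
base_transit_network = CubeTransit.create_from_cube("cube/base")
build_transit_network = CubeTransit.create_from_cube("cube/build")

# Additional .lin files can be folded into an existing instance.
build_transit_network.add_cube("cube/build_extra/alt_route.lin")

# Compare the two networks; the result is a list of project card
# change dictionaries describing added, deleted, and updated routes.
transit_change_list = build_transit_network.evaluate_differences(base_transit_network)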
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type:
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type:
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type:
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type:
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type:
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change. WARNING: does not handle non-contiguous time periods.

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and constructs returns list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and then returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters:
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns:
+

Line name with new time period.

+
+
+
+ +
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
+ +
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters:
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns:
+

constructed line_name i.e. “0_452-111_452_pk1”
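For example, a minimal sketch of calling this static method with the documented example values; the output string shown is taken from the example above and is not independently verified here:
from lasso import CubeTransit
line_name = CubeTransit.build_route_name(
    route_id="452-111", time_period="pk", agency_id=0, direction_id=1
)
# per the documented example, this yields a name like "0_452-111_452_pk1"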

+
+
+
+ +
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. +WARNING: does not handle non-contiguous time periods.

+
+
Parameters:
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters:
+

line – name of line that is being updated

+
+
Returns:
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters:
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns:
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters:
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period-specific variables and to variables +with standard units, such as headway, which is in minutes +in Cube and in seconds in the standard format.

+
+
Parameters:
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns:
+

+
<standard

style property name>, “set” : <property value with correct units>`

+
+
+

+
+
Return type:
+

A list of dictionaries with values for `”property”
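As a hedged sketch of the documented minutes-to-seconds conversion; the cube-style property key and the exact output keys below are assumptions, not confirmed library behavior:
from lasso import CubeTransit
cube_props = {"HEADWAY[1]": 10}  # hypothetical cube-style key; value in minutes
standard_props = CubeTransit.cube_properties_to_standard_properties(cube_props)
# per the docs, expect a list of {"property": ..., "set": ...} dictionaries,
# with headway converted from minutes to seconds (10 min -> 600 s)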

+
+
+
+ +
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
  3. +
    For routes being added or updated, identify if the time periods

    have changed or if there are multiples, and make duplicate lines if so

    +
    +
    +
  4. +
  5. Create project card dictionaries for each change.

  6. +
+
+
Parameters:
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns:
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.
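A hedged usage sketch; the directory paths below are placeholders:
from lasso import CubeTransit
base_tn = CubeTransit.create_from_cube("cube/base")    # placeholder directory of .lin files
build_tn = CubeTransit.create_from_cube("cube/build")  # placeholder directory of .lin files
transit_change_list = build_tn.evaluate_differences(base_tn)
# each entry is a project card change dictionary (add / delete / update)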

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters:
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns:
+

+
a list of dictionary values suitable for writing to a project card

{ +‘property’: <property_name>, +‘set’: <set value>, +‘change’: <change from existing value>, +‘existing’: <existing value to check>, +}

+
+
+

+
+
Return type:
+

transit_change_list (list)

+
+
+
+ +
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and constructs returns list of changes +suitable for a project card.

+
+
Parameters:
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns:
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and then +returns the numbers in them.

+
+
Parameters:
+

properties_list (list) – list of all properties.

+
+
Returns:
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters:
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns:
+

452-111 +time_period (str): i.e. pk +direction_id (str) : i.e. 1 +agency_id (str) : i.e. 0

+
+
Return type:
+

route_id (str)
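A hedged sketch of the inverse of build_route_name; the tuple order shown follows the order listed in the Returns section and is an assumption:
from lasso import CubeTransit
route_id, time_period, direction_id, agency_id = CubeTransit.unpack_route_name("0_452-111_452_pk1")
# per the docs: route_id "452-111", time_period "pk", direction_id "1", agency_id "0"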

+
+
+
+ +
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_generated/lasso.ModelRoadwayNetwork/index.html b/branch/remove_assignable/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..8837127 --- /dev/null +++ b/branch/remove_assignable/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1573 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters:
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints:
+

list of length of two unique keys of nodes making up endpoints of segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode:
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links:
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

+
rtype:
+

GeoDataFrame

+
+
+

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what metcouncil's model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
rtype:
+

bool

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
rtype:
+

tuple

+
+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that that property exists in the network.

validate_selection(selection[, ...])

Evaluate whetther the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. +#MC +join the network with count node data, via SHST API node match result

+
+
Parameters:
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+ +

Add data from links going to/from nodes to node.

+
+
Return type:
+

DataFrame

+
+
Parameters:
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns:
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

add the new roadway features defined in the project card. +new shapes are also added for the new roadway links.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters:
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True if overwriting existing variable. Default to False.

  • +
+
+
Returns:
+

None
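A hedged sketch, with net standing in for a ModelRoadwayNetwork instance; the file path and column names are placeholders, not real project data:
net.add_variable_using_shst_reference(
    var_shst_csvdata="shst_match/aadt.csv",  # placeholder path to a SHST API match result
    shst_csv_variable="aadt",                # placeholder column in the source data
    network_variable="AADT",
    network_var_type=int,
    overwrite=True,
)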

+
+
+
+ +
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters:
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters:
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters:
+
    +
  • link_idx – list +indices of all links to apply change to

  • +
  • properties – list of dictionaries of +roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph) +List of disconnected subgraphs described by the list of their

+
+

member nodes (as described by their model_node_id)

+
+
+
+
+ +
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Return type:
+

tuple

+
+
Parameters:
+

selection_dictonary – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. +This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in the area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters:
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

RoadwayNetwork
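A hedged sketch, assuming a ModelRoadwayNetwork instance named net and a Parameters instance named parameters; the TAZ ceiling of 3100 is the MetCouncil default documented in the Parameters class:
net.calculate_centroidconnect(parameters, highest_taz_number=3100)
# flags centroid connector links in the "centroidconnect" variable (1/0 when as_integer=True)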

+
+
+
+ +
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. +This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in the county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters:
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable.
#MC
Parameters:
  • county_variable (str) – Name of the variable where the county names are stored. Default to “county”.
  • network_variable (str) – Name of the variable that should be written to. Default to “mpo”.
  • as_integer (bool) – If true, will convert true/false to 1/0s.
  • mpo_counties (list) – List of county names that are within mpo region.
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters:
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Created ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters:
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters:
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str +roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Created hov corridor placeholder for project to write out corridor changes

+

hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Return type:
+

RoadwayNetwork

+
+
Parameters:
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance
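A hedged sketch, with net standing in for a ModelRoadwayNetwork instance; the extra attribute named here is purely illustrative:
ml_net = net.create_managed_lane_network(
    keep_additional_attributes_ml_and_gp=["county"],  # hypothetical extra attribute to copy
    in_place=False,
)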

+
+ +
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Created placeholder for project to write out managed

+

managed defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters:
+

df (pandas DataFrame) –

+
+
Returns:
+

dataframe with fixed width for each column. +dict: dictionary with columns names as keys, column width as values.

+
+
Return type:
+

pandas dataframe
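A hedged sketch of unpacking the two documented return values; links_df stands in for any pandas DataFrame:
fixed_width_df, col_width_dict = ModelRoadwayNetwork.dataframe_to_fixed_width(links_df)
# fixed_width_df: values padded to a fixed width per column (geometry left untouched)
# col_width_dict: {column name: column width}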

+
+
+
+ +
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

delete the roadway features defined in the project card. +valid links and nodes defined in the project get deleted +and shapes corresponding to the deleted links are also deleted.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters:
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDs by a scalar. +TODO #237: what if node ids are not a number?

+
+
Parameters:
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers
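Assuming the transformation is a simple addition of the scalar (consistent with the managed-lane scalar parameters described elsewhere on this page), a sketch:
ml_node_ids = ModelRoadwayNetwork.get_managed_lane_node_ids([101, 102, 103], scalar=4500000)
# -> [4500101, 4500102, 4500103] if the method adds the scalar to each id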

+
+ +
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Determines if the network graph is “strongly” connected. +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

Note: links with walk access are not always marked as having walk access; +issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145 +(modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1])

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters:
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type:
+

pandas series
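A hedged sketch using the example values from the parameter descriptions above; the property name "price" and the instance name net are assumptions:
pm_hov_price = net.get_property_by_time_period_and_group(
    "price",
    time_period=["16:00", "19:00"],
    category=["hov3", "hov2"],  # categories searched in order, per the docs
)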

+
+
+
+ +
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters:
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected. +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters:
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variable +in the links_df file. If nothing is specified, assume whatever the default is.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters:
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
+ +
+
Parameters:
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes network standard.

+
+
Parameters:
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

ModelRoadwayNetwork
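A hedged sketch of reading a standard network from disk; the file names are placeholders:
from lasso import ModelRoadwayNetwork
net = ModelRoadwayNetwork.read(
    link_filename="link.json",      # placeholder paths
    node_filename="node.geojson",
    shape_filename="shape.geojson",
    fast=True,                      # skip schema validation to speed up the read
)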

+
+
+
+ +
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reads many files of the same type and concatenates them into a single DataFrame.

+
+
Parameters:
+

path (str) – File path to SHST match results.

+
+
Returns:
+

geopandas geodataframe

+
+
Return type:
+

geodataframe

+
+
+

TODO: not sure why we need this, but it should be in utilities, not this class.

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters:
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns:
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+
+
Return type:
+

GeoDataFrame

+
+
+

Turn the roadway network into a GeoDataFrame.
Parameters:
  • roadway_net – the roadway network to export

+

returns: shapes dataframe

+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what metcouncil’s model is expecting.
#MC
Parameters:
  • output_epsg (int) – epsg number of output network.

+
+
Returns:
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Return type:
+

GeoDataFrame

+
+
+
+
Example usage:
+
net.select_roadway_features(
+
selection = [ {

# a match condition for the from node using osm, +# shared streets, or model node number +‘from’: {‘osm_model_link_id’: ‘1234’}, +# a match for the to-node.. +‘to’: {‘shstid’: ‘4321’}, +# a regex or match for facility condition +# could be # of lanes, facility type, etc. +‘facility’: {‘name’:’Main St’}, +}, … ])

+
+
+
+
+
+
+
+
Parameters:
+
    +
  • selection – dictionary with keys for: +A - from node +B - to node +link - which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiple used to discourage shortest paths which +meander from the original search returned from a name or ref query. +If not set here, will default to the value of sp_weight_factor in the +RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+ +
+ +
+
Return type:
+

bool

+
+
Parameters:
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters:
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple with length of four +- Boolean if shortest path found +- nx Directed graph of graph links +- route of shortest path nodes as List +- links in shortest path selected from links_df

+
+ +
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of

+
+
Parameters:
+

properties_to_split

dict +dictionary of output variable prefixes mapped to the source variable and what to stratify it by, +e.g. +{

+
+

’lanes’ : {‘v’:’lanes’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}}, +‘ML_lanes’ : {‘v’:’ML_lanes’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}}, +‘use’ : {‘v’:’use’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}},

+
+

}
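A hedged call sketch reusing the structure of the example above (the 'times_periods' key spelling is copied from that example); the resulting column names such as lanes_AM are assumed from the Parameters documentation, and net stands in for a ModelRoadwayNetwork instance:
net.split_properties_by_time_period_and_category({
    "lanes": {"v": "lanes", "times_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
})
# expected to add time-period columns such as lanes_AM and lanes_PM to net.links_df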

+

+
+
+
+ +
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters:
+
    +
  • links_df – Links GeoDataFrame. Useful if want to update a portion of network links +(i.e. only centroid connectors). If not provided, will use entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns:
+

links_df with updated distance
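For example, a hedged call that only fills missing distances, preferring true shape geometry where a matching shape exists (net is a ModelRoadwayNetwork instance):
net.update_distance(use_shapes=True, units="miles", network_variable="distance", overwrite=False)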

+
+
+
+ +
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that that +property exists in the network.

+
+
Return type:
+

bool

+
+
Parameters:
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn’t +a specified value in the project card for existing when a +change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluate whether the selection dictionary contains the +minimum required values.

+
+
Return type:
+

bool

+
+
Parameters:
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type:
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • path – the path were the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does: +1. write out link and node fixed width data files for cube. +2. write out header and width correspondence. +3. write out cube network building script with header and width specification.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name used to output additional link subset layers

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_generated/lasso.Parameters/index.html b/branch/remove_assignable/_generated/lasso.Parameters/index.html new file mode 100644 index 0000000..74e76df --- /dev/null +++ b/branch/remove_assignable/_generated/lasso.Parameters/index.html @@ -0,0 +1,555 @@ + + + + + + + lasso.Parameters — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a parameters instance +with a keyword argument setting the attribute. Parameters that are +not explicitly set will use the default parameters listed in this class.
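For example, a hedged sketch of overriding a single attribute at construction time (the attribute shown is one of the defaults documented below):
from lasso import Parameters
parameters = Parameters(centroid_connect_lanes=2)  # attributes not passed keep the defaults listed below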

+
+
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to times of day used in GTFS and the highway network standard. +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00, "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
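As an illustration of how such a crosswalk is typically applied (the link table and its column names here are hypothetical, not part of lasso's API):

import pandas as pd

# Hypothetical link table with the raw COMDES2040 designation already joined on.
links_df = pd.DataFrame({"model_link_id": [1, 2, 3], "COMDES2040": [23, 36, 60]})

area_type_code_dict = {23: 4, 24: 3, 25: 2, 35: 2, 36: 1, 41: 1, 51: 1, 52: 1, 53: 1, 60: 1}

# Recode the shapefile designation to the MetCouncil area type code.
links_df["area_type"] = links_df["COMDES2040"].map(area_type_code_dict)
print(links_df[["model_link_id", "area_type"]])  # area_type -> 4, 1, 1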
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class. Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +

Methods

+ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ + + + + + + + + + + + + + + +

cube_time_periods

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+county_link_range_dict
+

self.county_code_dict = {
    "Anoka": 1,
    "Carver": 2,
    "Dakota": 3,
    "Hennepin": 4,
    "Ramsey": 5,
    "Scott": 6,
    "Washington": 7,
    "external": 10,
    "Chisago": 11,
    "Goodhue": 12,
    "Isanti": 13,
    "Le Sueur": 14,
    "McLeod": 15,
    "Pierce": 16,
    "Polk": 17,
    "Rice": 18,
    "Sherburne": 19,
    "Sibley": 20,
    "St. Croix": 21,
    "Wright": 22,
}

+
+ +
+
+cube_time_periods
+

#MC
self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

self.route_type_mode_dict = {0: 8, 2: 9}

self.cube_time_periods = {"1": "AM", "2": "MD"}
self.cube_time_periods_name = {"AM": "pk", "MD": "op"}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link. The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
\ No newline at end of file
diff --git a/branch/remove_assignable/_generated/lasso.Project/index.html b/branch/remove_assignable/_generated/lasso.Project/index.html
new file mode 100644
index 0000000..db9cc2c
--- /dev/null
+++ b/branch/remove_assignable/_generated/lasso.Project/index.html
@@ -0,0 +1,521 @@
+lasso.Project — lasso documentation
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type:
+

dict

+
+
+
+ +
+roadway_link_changes
+

pandas dataframe of CUBE roadway link changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type:
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class containing the parameter settings for the project

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Project constructor.

+
+
Parameters:
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of Project

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

Project constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatability(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

Rewrites EMME IDs with Wrangler IDs, using the EMME-to-Wrangler ID crosswalk located in the database folder

emme_name_to_wrangler_name(...)

Renames EMME names to Wrangler names using the crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

Determines whether the network build object is a node or a link

get_operation_from_network_build_command()

Determines the network build object's action type

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters:
+

limit_variables_to_existing_network (bool) – True to limit variables to those in the existing network (i.e., no ad-hoc variables). Defaults to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters:
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

A Project instance.

+
+
+
+ +
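For orientation, a minimal sketch of building project cards from a Cube log file (the file and directory paths here are hypothetical placeholders):

import os
from lasso import Project

test_project = Project.create_project(
    roadway_log_file=os.path.join("examples", "cube", "roadway_changes.log"),  # hypothetical path
    base_roadway_dir=os.path.join("examples", "base_roadway"),                 # hypothetical path
    project_name="example_roadway_project",
)
test_project.evaluate_changes()
test_project.write_project_card(
    os.path.join("tests", "scratch", "example_roadway_project.yml")
)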
+
+static determine_roadway_network_changes_compatability(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

Rewrites EMME IDs with Wrangler IDs, using the EMME-to-Wrangler ID crosswalk located in the database folder

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

Renames EMME names to Wrangler names using the crosswalk file

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

Determines whether the network build object is a node or a link

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

Determines the network build object's action type

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns:
+

A DataFrame representation of the log file.

+
+
+
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

networkbuildfilename (str or list[str]) – File path to EMME network build file or list of network build file paths.

+
+
Returns:
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters:
+

filename (str) – File path to output .yml

+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
\ No newline at end of file
diff --git a/branch/remove_assignable/_generated/lasso.StandardTransit/index.html b/branch/remove_assignable/_generated/lasso.StandardTransit/index.html
new file mode 100644
index 0000000..07f2d1d
--- /dev/null
+++ b/branch/remove_assignable/_generated/lasso.StandardTransit/index.html
@@ -0,0 +1,453 @@
+lasso.StandardTransit — lasso documentation
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters:
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed:
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by the following logic.

cube_format(row)

Creates a string representing the route in cube line file notation. #MC :param row: row of a DataFrame representing a cube-formatted trip, with the Attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR.

evaluate_differences(transit_changes)

Compare changes from the transit_changes dataframe with the standard transit network returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

Converts a TransitNetwork instance to a StandardTransit instance

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes that for the route in appropriate cube format.

shape_gtfs_to_dict_list(trip_id, shape_id, ...)

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of stepping through the routed nodes and corresponding them with shape nodes.

shape_gtfs_to_emme(trip_row)

Creates transit segment for the trips in appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number by the following logic. #MC For rail, uses the GTFS route_type variable: https://developers.google.com/transit/gtfs/reference

+
+
::

    # route_type : cube_mode
    route_type_to_cube_mode = {0: 8,  # Tram, Streetcar, Light rail
                               3: 0,  # Bus; further disaggregated for cube
                               2: 9}  # Rail

For buses, uses route id numbers and route name to find express and suburban buses as follows:

::

    if not cube_mode:
        if 'express' in row['LONGNAME'].lower():
            cube_mode = 7  # Express
        elif int(row['route_id'].split("-")[0]) > 99:
            cube_mode = 6  # Suburban Local
        else:
            cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters:
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns:
+

cube mode number

+
+
+
+ +
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation. #MC :param row: row of a DataFrame representing a cube-formatted trip, with the Attributes

+
+

trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns:
+

string representation of route in cube line file notation

+
+
+
+ +
+
+evaluate_differences(transit_changes)[source]
+

Compare changes from the transit_changes dataframe with the standard transit network +returns the project card changes in dictionary format

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

Converts a TransitNetwork instance to a StandardTransit instance.

+
+
Parameters:
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters:
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepare gtfs for cube lin file.
#MC
Does the following operations:

1. Combines route, frequency, trip, and shape information
2. Converts time of day to time periods
3. Calculates cube route name from gtfs route name and properties
4. Assigns a cube-appropriate mode number
5. Assigns a cube-appropriate operator number

+
+
Returns:
+

+
DataFrame of trips with cube-appropriate values for:
    +
  • NAME

  • +
  • ONEWAY

  • +
  • OPERATOR

  • +
  • MODE

  • +
  • HEADWAY

  • +
+
+
+

+
+
Return type:
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes that for the route in appropriate +cube format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list

for a route in cube format.

+
+
+
+ +
+
+shape_gtfs_to_dict_list(trip_id, shape_id, add_nntime)[source]
+

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of +stepping through the routed nodes and corresponding them with shape nodes.

+

TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when +the transit routing on the roadway network is first performed.

+

As such, I’m copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications.

+
+
Parameters:
+
    +
  • trip_id – trip_id of the trip in question

  • shape_id – shape_id of the trip in question

  • +
+
+
Returns:
+

trip_id, shape_id, shape_pt_sequence, shape_mode_node_id, is_stop, access, stop_sequence

+
+
Return type:
+

list of dict records with columns

+
+
+
+ +
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segment for the trips in appropriate +emme format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a dataframe representation of the transit segment

for a trip in emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters:
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns:
+

+
if as_str is False, returns the numeric time period

this_tp: if as_str is True, returns the Cube time period name abbreviation

+
+
+

+
+
Return type:
+

this_tp_num

+
+
+
+ +
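As a quick illustration (assuming a StandardTransit instance built as in the usage example above, and the default time-period definitions):

# 6:30 a.m. expressed as seconds from midnight.
start_secs = 6 * 3600 + 30 * 60

# With as_str=True the Cube time period name abbreviation is returned;
# under the default period definitions a 6:30 a.m. start falls in the AM period.
tp_name = cube_transit_net.time_to_cube_time_period(start_secs, as_str=True)

# With as_str=False the numeric time period is returned instead.
tp_num = cube_transit_net.time_to_cube_time_period(start_secs, as_str=False)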
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the gtfs feed as a cube line file after +converting gtfs properties to MetCouncil cube properties. +#MC +:param outpath: File location for output cube line file.

+
+ +
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
\ No newline at end of file
diff --git a/branch/remove_assignable/_generated/lasso.logger/index.html b/branch/remove_assignable/_generated/lasso.logger/index.html
new file mode 100644
index 0000000..b6cabdb
--- /dev/null
+++ b/branch/remove_assignable/_generated/lasso.logger/index.html
@@ -0,0 +1,143 @@
+lasso.logger — lasso documentation
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.logger

+

Functions

+ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The infoLog is terse, just gives the bare minimum of details +so the network composition will be clear later. +The debuglog is very noisy, for debugging.

+

Pass None to either to skip that log file. Spews it all out to the console too, if logToConsole is True.

+
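For example, a minimal sketch (the log file names here are arbitrary):

from lasso.logger import setupLogging

# Terse info messages go to one file, verbose debug messages to another,
# and everything is echoed to the console as well.
setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)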
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
\ No newline at end of file
diff --git a/branch/remove_assignable/_generated/lasso.util/index.html b/branch/remove_assignable/_generated/lasso.util/index.html
new file mode 100644
index 0000000..d088911
--- /dev/null
+++ b/branch/remove_assignable/_generated/lasso.util/index.html
@@ -0,0 +1,1696 @@
+lasso.util — lasso documentation
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.util

+

Functions

+ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

creates circular buffer polygon for node

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from a seconds from midnight

shorten_name(name)

+
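As an illustrative sketch using only the signatures listed above (the coordinate values are arbitrary, and the return types are assumed: a time object from hhmmss_to_datetime and a buffer polygon from geodesic_point_buffer):

from lasso.util import geodesic_point_buffer, hhmmss_to_datetime

# Parse a GTFS-style clock time string.
departure_time = hhmmss_to_datetime("06:30:00")

# Build an approximately 100-meter circular buffer around a node
# near downtown St. Paul (lat/lon chosen only for illustration).
buffer_poly = geodesic_point_buffer(44.95, -93.09, 100)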
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A geometry type that represents a single coordinate with +x,y and possibly z values.

+

A point is a zero-dimensional feature and has zero length and zero area.

+
+
Parameters:
+

args (float, or sequence of floats) –

The coordinates can either be passed as a single parameter, or as +individual float values using multiple parameters:

+
    +
  1. 1 parameter: a sequence or array-like of with 2 or 3 values.

  2. +
  3. 2 or 3 parameters (float): x, y, and possibly z.

  4. +
+

+
+
+
+
+x, y, z
+

Coordinate values

+
+
Type:
+

float

+
+
+
+ +

Examples

+

Constructing the Point using separate parameters for x and y:

+
>>> p = Point(1.0, -1.0)
+
+
+

Constructing the Point using a list of x, y coordinates:

+
>>> p = Point([1.0, -1.0])
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A geometry type representing an area that is enclosed by a linear ring.

+

A polygon is a two-dimensional feature and has a non-zero area. It may +have one or more negative-space “holes” which are also bounded by linear +rings. If any rings cross each other, the feature is invalid and +operations on it may fail.

+
+
Parameters:
+
    +
  • shell (sequence) – A sequence of (x, y [,z]) numeric coordinate pairs or triples, or +an array-like with shape (N, 2) or (N, 3). +Also can be a sequence of Point objects.

  • +
  • holes (sequence) – A sequence of objects which satisfy the same requirements as the +shell parameters above

  • +
+
+
+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type:
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type:
+

sequence

+
+
+
+ +

Examples

+

Create a square polygon with no holes

+
>>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.))
+>>> polygon = Polygon(coords)
+>>> polygon.area
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.

+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
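A small illustrative sketch (assuming shapely 2.x): with a tolerance of 0.2, which exceeds the 0.1-unit deviation of the middle vertex, Douglas-Peucker should drop that vertex.

>>> from shapely import LineString
>>> LineString([(0, 0), (1, 0.1), (2, 0)]).simplify(0.2)
<LINESTRING (0 0, 2 0)>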
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
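A hedged usage sketch for svg; the only assumption beyond the parameters listed above is that the returned string begins with an SVG <path> element, as it does in recent shapely releases.

>>> from shapely import Polygon
>>> svg_path = Polygon([(0, 0), (1, 0), (1, 1)]).svg(scale_factor=1.0, opacity=0.6)
>>> svg_path.startswith("<path")
True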
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
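A brief doctest sketch of two of the boolean predicates documented on this page (within and intersects), using two crossing diagonals and a point that lies on one of them.

>>> from shapely import LineString, Point
>>> line = LineString([(0, 0), (2, 2)])
>>> Point(1, 1).within(line)
True
>>> line.intersects(LineString([(0, 2), (2, 0)]))
True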
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

+

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
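A hedged sketch of the degenerate case noted above: the convex hull of a two-point line is the line itself, so the oriented envelope is returned unchanged (repr assumes shapely 2.x).

>>> from shapely import LineString
>>> LineString([(1, 1), (2, 2)]).minimum_rotated_rectangle
<LINESTRING (1 1, 2 2)>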
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls

+
+ +
+ +
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

Creates a circular buffer polygon for a node

+
+
Parameters:
+
    +
  • lat – node lat

  • +
  • lon – node lon

  • +
  • meters – buffer distance, radius of circle

  • +
+
+
Returns:
+

Polygon

+
+
+
+ +
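A hedged usage sketch for geodesic_point_buffer; the coordinates are arbitrary illustration values, and the only behavior relied on is the documented return type (a circular buffer Polygon with the radius given in meters).

from lasso.util import geodesic_point_buffer

# roughly 100 m circular buffer around an illustrative node location
buffer_poly = geodesic_point_buffer(lat=44.95, lon=-93.10, meters=100)
print(buffer_poly.geom_type)  # expected: 'Polygon'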
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

+
+
Expected in/out:

in: longitude = -93.0965985, latitude = 44.952112199999995, osm_node_id = 954734870

out: 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
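Connecting the expected in/out above to an actual call, a hedged sketch (argument order follows the documented signature: lat, long, then osm_node_id; the hash string is the one quoted above).

from lasso.util import get_shared_streets_intersection_hash

node_hash = get_shared_streets_intersection_hash(
    44.952112199999995,  # lat
    -93.0965985,         # long
    954734870,           # osm_node_id
)
# per the expected in/out above:
# node_hash == "69f13f881649cb21ee3b359730790bb9"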
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters:
+

hhmmss_str – string of hh:mm:ss

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

datetime.time

+
+
+
+ +
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime time object from a seconds from midnight

+
+
Parameters:
+

secs – seconds from midnight

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

datetime.time

+
+
+
+ +
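A brief, hedged sketch of the two converters above; the concrete return values assume they behave exactly as their docstrings describe (a datetime.time built from an hh:mm:ss string or from seconds after midnight).

from lasso.util import hhmmss_to_datetime, secs_to_datetime

t1 = hhmmss_to_datetime("06:30:15")             # expected: datetime.time(6, 30, 15)
t2 = secs_to_datetime(6 * 3600 + 30 * 60 + 15)  # expected: the same time of day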
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input +parameters may iterable types like lists or arrays or single values. +The output shall be of the same type. Scalars in, scalars out. +Lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):
    return tuple(filter(None, [x, y, z]))

g2 = transform(id_func, g1)

+
+

Using pyproj >= 2.1, this example will accurately project Shapely geometries:

+
+

import pyproj

wgs84 = pyproj.CRS('EPSG:4326')
utm = pyproj.CRS('EPSG:32618')

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate a Unicode object into an ASCII string

+
+
Return type:
+

str

+
+
+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_modules/functools/index.html b/branch/remove_assignable/_modules/functools/index.html new file mode 100644 index 0000000..b3abe79 --- /dev/null +++ b/branch/remove_assignable/_modules/functools/index.html @@ -0,0 +1,1083 @@ + + + + + + functools — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
[docs]class partial: + """New function with partial application of the given arguments + and keywords. + """ + + __slots__ = "func", "args", "keywords", "__dict__", "__weakref__" + + def __new__(cls, func, /, *args, **keywords): + if not callable(func): + raise TypeError("the first argument must be callable") + + if hasattr(func, "func"): + args = func.args + args + keywords = {**func.keywords, **keywords} + func = func.func + + self = super(partial, cls).__new__(cls) + + self.func = func + self.args = args + self.keywords = keywords + return self + + def __call__(self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(*self.args, *args, **keywords) + + @recursive_repr() + def __repr__(self): + qualname = type(self).__qualname__ + args = [repr(self.func)] + args.extend(repr(x) for x in self.args) + args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items()) + if type(self).__module__ == "functools": + return f"functools.{qualname}({', '.join(args)})" + return f"{qualname}({', '.join(args)})" + + def __reduce__(self): + return type(self), (self.func,), (self.func, self.args, + self.keywords or None, self.__dict__ or None) + + def __setstate__(self, state): + if not isinstance(state, tuple): + raise TypeError("argument to __setstate__ must be a tuple") + if len(state) != 4: + raise TypeError(f"expected 4 items in state, got {len(state)}") + func, args, kwds, namespace = state + if (not callable(func) or not isinstance(args, tuple) or + (kwds is not None and not isinstance(kwds, dict)) or + (namespace is not None and not isinstance(namespace, dict))): + raise TypeError("invalid partial state") + + args = tuple(args) # just in case it's a subclass + if kwds is None: + kwds = {} + elif type(kwds) is not dict: # XXX does it need to be *exactly* dict? + kwds = dict(kwds) + if namespace is None: + namespace = {} + + self.__dict__ = namespace + self.func = func + self.args = args + self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_modules/index.html b/branch/remove_assignable/_modules/index.html new file mode 100644 index 0000000..d50382b --- /dev/null +++ b/branch/remove_assignable/_modules/index.html @@ -0,0 +1,116 @@ + + + + + + Overview: module code — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
+
+
+ +
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_modules/lasso/logger/index.html b/branch/remove_assignable/_modules/lasso/logger/index.html new file mode 100644 index 0000000..53dd18d --- /dev/null +++ b/branch/remove_assignable/_modules/lasso/logger/index.html @@ -0,0 +1,153 @@ + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
+
+ +
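A short usage sketch for the logger module shown above; the log filenames are placeholders.

from lasso.logger import WranglerLogger, setupLogging

# terse run log to one file, noisy debug log to another, and echo everything to the console
setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)
WranglerLogger.info("network build started")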
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_modules/lasso/parameters/index.html b/branch/remove_assignable/_modules/lasso/parameters/index.html new file mode 100644 index 0000000..9c8122b --- /dev/null +++ b/branch/remove_assignable/_modules/lasso/parameters/index.html @@ -0,0 +1,1047 @@ + + + + + + lasso.parameters — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.parameters

+import os
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise (ValueError(msg))
+
+
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. + """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 
'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source 
https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + self.fare_2015_to_2000_deflator = 180.20/258.27 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 
'subregion' + #### + + self.output_variables = [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + #"roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("epsg:2875") + self.output_proj4 = '+proj=lcc +lat_0=32.1666666666667 +lon_0=-116.25 +lat_1=33.8833333333333 +lat_2=32.7833333333333 +x_0=2000000.0001016 +y_0=500000.0001016 +ellps=GRS80 +towgs84=-0.991,1.9072,0.5129,-1.25033e-07,-4.6785e-08,-5.6529e-08,0 +units=us-ft +no_defs +type=crs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '2875.prj') + self.wkt_projection = 'PROJCS["NAD83(HARN) / California zone 6 (ftUS)",GEOGCS["NAD83(HARN)",DATUM["NAD83_High_Accuracy_Reference_Network",SPHEROID["GRS 1980",6378137,298.257222101],TOWGS84[-0.991,1.9072,0.5129,-1.25033E-07,-4.6785E-08,-5.6529E-08,0]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4152"]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["latitude_of_origin",32.1666666666667],PARAMETER["central_meridian",-116.25],PARAMETER["standard_parallel_1",33.8833333333333],PARAMETER["standard_parallel_2",32.7833333333333],PARAMETER["false_easting",6561666.667],PARAMETER["false_northing",1640416.667],UNIT["US survey foot",0.304800609601219],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","2875"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 4756 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", 
+ "count_PM", + "count_NT", + "count_daily", + "centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + "truck_access", + #MTC + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + # paramters added for PNR simulation + self.pnr_node_location = os.path.join( + self.data_file_location, "lookups", "pnr_stations.csv" + ) + self.pnr_buffer = 20 + self.knr_buffer = 2.5 + self.walk_buffer = 0.75 + self.transfer_buffer = 1 + self.taz_list = os.path.join( + self.data_file_location, "lookups", "taz_lists.csv" + ) + self.sf_county = os.path.join( + self.data_file_location, "lookups", "SFcounty.shp" + ) + + self.__dict__.update(kwargs)
\ No newline at end of file
diff --git a/branch/remove_assignable/_modules/lasso/project/index.html b/branch/remove_assignable/_modules/lasso/project/index.html new file mode 100644 index 0000000..65ad8b5 --- /dev/null +++ b/branch/remove_assignable/_modules/lasso/project/index.html @@ -0,0 +1,1505 @@
lasso.project — lasso documentation

Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_link_changes: dataframe of roadway changes read from a log file + roadway_node_changes: dataframe of roadway changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatibility( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
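A hedged sketch of the Cube log-file workflow that create_project supports, assuming Project is importable from the lasso package; all paths are hypothetical, and the base roadway directory is expected to contain the link.json, node.geojson, and shape.geojson files read above:

from lasso import Project

project = Project.create_project(
    roadway_log_file="/path/to/cube_edits.log",        # hypothetical Cube logfile
    base_roadway_dir="/path/to/base_roadway_network",  # hypothetical base network folder
    parameters={"lasso_base_dir": "/path/to/lasso"},   # hypothetical parameter settings
)
project.write_project_card("/path/to/cube_edits.yml")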
+ +
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]): + """ + Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("N") + ] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("L") + ] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + def split_log(x): + return list(reader([x], delimiter=',', quotechar='"'))[0] + + _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df, _node_df]) + link_df = pd.concat([link_df, _link_df]) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + link_df.columns = [c.split("[")[0] for c in link_df.columns] + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + node_df.columns = [c.split("[")[0] for c in node_df.columns] + + if len(link_df) > 0: + # create operation history + action_history_df = ( + link_df.groupby(['A', 'B'])["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left") + + if len(node_df) > 0: + action_history_df = ( + node_df.groupby('N')["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + node_df = pd.merge(node_df, action_history_df, on='N', how="left") + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return link_df, node_df
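A small sketch of calling the logfile reader directly (paths hypothetical); note that the consolidated operation_final column is only added when link or node records are actually present:

# Continuing the sketch above: read one or more Cube logfiles.
link_df, node_df = Project.read_logfile(
    ["/path/to/edit_1.log", "/path/to/edit_2.log"]  # hypothetical logfiles
)
# Final consolidated operation (A/C/D/N) per changed link, when link rows exist.
print(link_df[["A", "B", "OPERATION", "operation_final"]].head())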
+ +
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
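Similarly, a hedged sketch of reading an EMME network build file on its own (path hypothetical); the element ids returned here are still EMME ids until emme_id_to_wrangler_id below is applied:

link_cmds, node_cmds, transit_cmds = Project.read_network_build_file(
    "/path/to/network_build.json"  # hypothetical EMME network build file
)
# operation_final is only present when the file contained link or node commands.
print(link_cmds[["element_id", "operation", "operation_final"]].head())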
+ +
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'] if n not in emme_node_id_crosswalk_df['emme_node_id'] + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
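The node id crosswalk consumed above is a CSV with emme_node_id and model_node_id columns; an illustration with hypothetical values, continuing the sketch above:

# emme_node_id_crosswalk.csv (hypothetical values)
# emme_node_id,model_node_id
# 1001,3000001
# 1002,3000002
link_cmds, node_cmds, transit_cmds = Project.emme_id_to_wrangler_id(
    link_cmds, node_cmds, transit_cmds,
    emme_node_id_crosswalk_file="/path/to/emme_node_id_crosswalk.csv",  # hypothetical path
)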
+ +
[docs] def get_object_from_network_build_command(row): + """ + Determine which network object a network build command applies to. + + Args: + row: a single command record from the network build command history + + Returns: + 'L' for links, 'N' for nodes, or 'TRANSIT_LINE' / 'TRANSIT_STOP' / 'TRANSIT_SHAPE' for transit elements + """ + + if row.get('command') == 'create_link': + return 'L' + + if row.get('command') == 'create_node': + return 'N' + + if row.get('command') == 'delete_link': + return 'L' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'L' + if row.get('parameters').get('element_type') == 'NODE': + return 'N' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'TRANSIT_LINE' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'TRANSIT_STOP' + + if row.get('command') == 'modify_transit_line': + return 'TRANSIT_SHAPE'
+ +
[docs] def get_operation_from_network_build_command(row): + """ + Determine the action type of a network build command. + + Args: + row: a single command record from the network build command history + + Returns: + 'A' (add), 'C' (change), or 'D' (delete) + """ + + if row.get('command') == 'create_link': + return 'A' + + if row.get('command') == 'create_node': + return 'A' + + if row.get('command') == 'delete_link': + return 'D' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'C' + if row.get('parameters').get('element_type') == 'NODE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'C' + + if row.get('command') == 'modify_transit_line': + return 'C'
+ +
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
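The attribute-name crosswalk is likewise a two-column CSV mapping emme_name to wrangler_name; an illustration with hypothetical rows, continuing the sketch above:

# emme_attribute_names.csv (hypothetical rows)
# emme_name,wrangler_name
# @lanes_am,lanes_AM
# @useclass_am,useclass_AM
link_cmds, node_cmds = Project.emme_name_to_wrangler_name(
    link_cmds, node_cmds, "/path/to/emme_attribute_names.csv"  # hypothetical path
)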
+ +
[docs] @staticmethod + def determine_roadway_network_changes_compatability( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
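The two crosswalks read at the top of this compatibility check are simple two-column CSVs; a hedged illustration of their shape (the attribute names shown are hypothetical):

# log_to_net.csv maps CUBE logfile names to standard network names:
#   log,net
#   LANES,lanes
#   DISTANCE,distance
# net_to_dbf.csv maps DBF names back to standard network names:
#   dbf,net
#   LANES_AM,lanes_AM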
+ +
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated, initiates + self.card_data to be an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if (not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
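After evaluate_changes() runs, self.card_data holds the dictionary written out by write_project_card(); a sketch of its structure with hypothetical content:

# self.card_data after evaluate_changes() (hypothetical content):
# {
#     "project": "cube_edits",
#     "changes": [<transit change dicts>..., <highway change dicts>...],
# }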
+ +
[docs] def add_transit_changes(self): + """ + Evaluates changes between base and build transit objects and + adds entries into the self.card_data dictionary. + """ + if self.build_cube_transit_network: + transit_change_list = self.build_cube_transit_network.evaluate_differences( + self.base_cube_transit_network + ) + elif self.base_transit_network: + transit_change_list = self.base_transit_network.evaluate_differences( + self.transit_changes + ) + return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
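A worked illustration of how _final_op collapses an operation history (A = add, C = change, D = delete) into a single final operation, with 'N' marking a net no-op:

# ["A", "C"] -> "A"   added then edited: still an addition
# ["A", "D"] -> "N"   added then deleted: net no-op
# ["D", "A"] -> "C"   deleted then re-added: treated as a change
# ["C", "D"] -> "D"   changed then deleted: a deletion
# ["C", "C"] -> "C"   repeated edits: a single change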
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
+ WranglerLogger.warning(msg) + #raise ValueError(msg) + node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"] + + if add_link_dict: + add_link_dict["nodes"] = _process_node_additions(node_add_df) + else: + add_link_dict = {"category": "Add New Roadway", "nodes": _process_node_additions(node_add_df)} + + else: + None + + # combine together + + highway_change_list = list( + filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list) + ) + + return highway_change_list
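For reference, the "group the changes that are the same" step above stringifies each property-change list, groups on that string, and collects the affected link IDs so one project card entry can cover many links. A minimal, self-contained pandas sketch of the same pattern (link IDs and values are made up):

    import pandas as pd

    # three made-up change records; the first two carry an identical property change
    change_df = pd.DataFrame(
        {
            "properties": [
                str([{"property": "lanes", "set": 3}]),
                str([{"property": "lanes", "set": 3}]),
                str([{"property": "lanes", "set": 2}]),
            ],
            "model_link_id": [101, 102, 103],
        }
    )

    # group identical (stringified) property lists and collect their link ids
    grouped = (
        change_df.groupby("properties")[["model_link_id"]]
        .agg(lambda x: list(x))
        .reset_index()
    )
    print(grouped)  # the "set": 3 change now lists links [101, 102]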
diff --git a/branch/remove_assignable/_modules/lasso/roadway/index.html b/branch/remove_assignable/_modules/lasso/roadway/index.html
new file mode 100644
index 0000000..be27108
--- /dev/null
+++ b/branch/remove_assignable/_modules/lasso/roadway/index.html
@@ -0,0 +1,2036 @@

Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
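A minimal usage sketch of the reader; the file paths are placeholders, and the top-level import assumes the package exports ModelRoadwayNetwork (otherwise import it from lasso.roadway):

    from lasso import ModelRoadwayNetwork

    net = ModelRoadwayNetwork.read(
        link_filename="standard_net/link.json",      # placeholder paths
        node_filename="standard_net/node.geojson",
        shape_filename="standard_net/shape.geojson",
        fast=True,  # skip schema validation for speed
    )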
+ +
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
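Converting an existing network_wrangler RoadwayNetwork is a one-liner; in this sketch roadway_net is assumed to be an already-loaded RoadwayNetwork instance:

    from lasso import ModelRoadwayNetwork  # assumed package-level export

    # roadway_net: an existing network_wrangler.RoadwayNetwork instance
    model_net = ModelRoadwayNetwork.from_RoadwayNetwork(
        roadway_network_object=roadway_net,
        parameters={},  # empty dict falls back to default Parameters
    )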
+ +
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None): + """ + Splits properties by time period, assuming a variable structure of + + Args: + properties_to_split: dict + dictionary of output variable prefix mapped to the source variable and what to stratify it by + e.g. + { + 'lanes' : {'v':'lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'ML_lanes' : {'v':'ML_lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'use' : {'v':'use', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + } + + """ + import itertools + + if properties_to_split == None: + properties_to_split = self.parameters.properties_to_split + + for out_var, params in properties_to_split.items(): + if params["v"] not in self.links_df.columns: + WranglerLogger.warning( + "Specified variable to split: {} not in network variables: {}. Returning 0.".format( + params["v"], str(self.links_df.columns) + ) + ) + if params.get("time_periods") and params.get("categories"): + + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + time_suffix + "_" + category_suffix + ] = 0 + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[out_var + "_" + time_suffix] = 0 + elif params.get("time_periods") and params.get("categories"): + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + category_suffix + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=params["categories"][category_suffix], + time_period=params["time_periods"][time_suffix], + ) + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[ + out_var + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=None, + time_period=params["time_periods"][time_suffix], + ) + else: + raise ValueError( + "Shoudn't have a category without a time period: {}".format(params) + )
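The dictionary normally comes from Parameters.properties_to_split; a hand-built one might look like the sketch below (time-period labels are illustrative, and note that the code reads the key "time_periods" even though the docstring example spells it "times_periods"). Here model_net is a ModelRoadwayNetwork instance as in the earlier sketches:

    properties_to_split = {
        "lanes": {
            "v": "lanes",
            "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
        "ML_lanes": {
            "v": "ML_lanes",
            "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
    }
    model_net.split_properties_by_time_period_and_category(properties_to_split)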
+ +
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
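calculate_county and calculate_area_type above share the same centroid spatial-join pattern. A self-contained geopandas sketch of that pattern with synthetic geometries (newer geopandas versions take predicate= where the methods above still pass the older op= keyword):

    import geopandas as gpd
    from shapely.geometry import LineString, Polygon

    links = gpd.GeoDataFrame(
        {"model_link_id": [1]},
        geometry=[LineString([(0, 0), (2, 0)])],
        crs="EPSG:26915",
    )
    zones = gpd.GeoDataFrame(
        {"zone_name": ["A"]},
        geometry=[Polygon([(0, -1), (3, -1), (3, 1), (0, 1)])],
        crs="EPSG:26915",
    )

    # join each link's centroid to the zone polygon it falls in
    centroids = links.copy()
    centroids["geometry"] = centroids.geometry.centroid
    joined = gpd.sjoin(centroids, zones, how="left", predicate="intersects")
    links["zone_name"] = joined["zone_name"]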
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
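A usage sketch (the CSV path and column name are placeholders; model_net is a ModelRoadwayNetwork instance). Note that the two error branches in the method format an undefined name, var_shst_data, so a missing column would surface as a NameError rather than the intended ValueError:

    model_net.add_variable_using_shst_reference(
        var_shst_csvdata="shst/aadt_match.csv",  # placeholder SHST API match result
        shst_csv_variable="aadt",                # column to pull from that CSV
        network_variable="AADT",
        network_var_type=int,
        overwrite=True,
    )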
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
[docs] @staticmethod + def read_match_result(path): + """ + Reads the shst geojson match returns. + + Returns shst dataframe. + + Reading lots of same type of file and concatenating them into a single DataFrame. + + Args: + path (str): File path to SHST match results. + + Returns: + geodataframe: geopandas geodataframe + + ##todo + not sure why we need, but should be in utilities not this class + """ + refId_gdf = DataFrame() + refid_file = glob.glob(path) + for i in refid_file: + new = gpd.read_file(i) + refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False) + return refId_gdf
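Because the path argument is passed to glob, several SHST return files can be combined in one call (the pattern below is a placeholder):

    matched_gdf = ModelRoadwayNetwork.read_match_result(
        "shst_output/*.matched.geojson"  # placeholder glob pattern
    )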
+ +
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # , # targetted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (str): File path to source data SHST match result. + source_gdf (str): File path to source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + None + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_ink_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
+ +
[docs] def calculate_use( + self, + network_variable="use", + as_integer=True, + overwrite=False, + ): + """ + Calculates use variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "use" + as_integer (bool): If True, will convert true/false to 1/0s. Defauly to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "'use' Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + #MTC + self.links_df[network_variable] = int(1) + #/MTC + + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + #/MC + + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs] def create_ML_variable( + self, + network_variable="ML_lanes", + overwrite=False, + ): + """ + Created ML lanes placeholder for project to write out ML changes + + ML lanes default to 0, ML info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing ML Variable '{}' already in network".format( + network_variable + ) + ) + self.links_df[network_variable] = int(0) + else: + WranglerLogger.info( + "ML Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + WranglerLogger.info( + "Finished creating ML lanes variable: {}".format(network_variable) + )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + Returns: + RoadwayNetwork + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
+ + +
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
+ +
[docs] def convert_int(self, int_col_names=[]): + """ + Convert integer columns + """ + + #MTC + WranglerLogger.info( + "Converting variable type to mtc standard" + ) + + int_col_names = self.parameters.int_col + #/MTC + #MC + """ + WranglerLogger.info("Converting variable type to MetCouncil standard") + + if not int_col_names: + int_col_names = self.parameters.int_col + #/MC + """ + ##Why are we doing this? + # int_col_names.remove("lanes") + + for c in list(set(self.links_df.columns) & set(int_col_names)): + try: + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + self.links_df[c] = self.links_df[c].replace("", 0) + self.links_df[c] = self.links_df[c].astype(int) + except ValueError: + try: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + except: + msg = f"Could not convert column {c} to integer." + WranglerLogger.error(msg) + raise ValueError(msg) + + for c in list(set(self.nodes_df.columns) & set(int_col_names)): + self.nodes_df[c] = self.nodes_df[c].replace("", 0) + self.nodes_df[c] = self.nodes_df[c].astype(int)
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
+ +
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "heuristic_num" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2" + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
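A call might look like the sketch below; the directory and file names are placeholders, and the method expects the MTC-formatted frames (links_mtc_df / nodes_mtc_df), creating them via roadway_standard_to_met_council_network() if the node frame is missing:

    model_net.write_roadway_as_shp(
        output_dir="output",            # placeholder directory
        output_link_shp="links.shp",
        output_node_shp="nodes.shp",
        output_link_csv="links.csv",
        output_node_csv="nodes.csv",
        data_to_csv=True,
    )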
+ + + # this should be moved to util +
[docs] @staticmethod + def dataframe_to_fixed_width(df): + """ + Convert dataframe to fixed width format, geometry column will not be transformed. + + Args: + df (pandas DataFrame). + + Returns: + pandas dataframe: dataframe with fixed width for each column. + dict: dictionary with columns names as keys, column width as values. + """ + WranglerLogger.info("Starting fixed width conversion") + + # get the max length for each variable column + max_width_dict = dict( + [ + (v, df[v].apply(lambda r: len(str(r)) if r != None else 0).max()) + for v in df.columns.values + if v != "geometry" + ] + ) + + fw_df = df.drop("geometry", axis=1).copy() + for c in fw_df.columns: + fw_df[c] = fw_df[c].apply(lambda x: str(x)) + fw_df["pad"] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x))) + fw_df[c] = fw_df.apply(lambda x: x["pad"] + x[c], axis=1) + + return fw_df, max_width_dict
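A small worked example of the fixed-width conversion (values are made up; the frame includes a geometry column because the method drops one unconditionally):

    import pandas as pd
    from lasso import ModelRoadwayNetwork  # assumed package-level export

    df = pd.DataFrame(
        {
            "N": [1, 23, 456],
            "county": ["Hennepin", "Ramsey", "Anoka"],
            "geometry": [None, None, None],  # dropped by the method
        }
    )
    fw_df, width_dict = ModelRoadwayNetwork.dataframe_to_fixed_width(df)
    print(width_dict)  # {'N': 3, 'county': 8} -- each value right-justified to this width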
+ +
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
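A usage sketch (all file names are placeholders and are written inside output_dir; note the method ends by shelling out to Cube Voyager's runtpp.exe at a hard-coded install path):

    model_net.write_roadway_as_fixedwidth(
        output_dir="output",                              # placeholder directory
        output_link_txt="links.txt",
        output_node_txt="nodes.txt",
        output_link_header_width_txt="links_header_width.csv",
        output_node_header_width_txt="nodes_header_width.csv",
        output_cube_network_script="make_complete_network.s",
        drive_only=False,
    )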
diff --git a/branch/remove_assignable/_modules/lasso/transit/index.html b/branch/remove_assignable/_modules/lasso/transit/index.html
new file mode 100644
index 0000000..e075d49
--- /dev/null
+++ b/branch/remove_assignable/_modules/lasso/transit/index.html
@@ -0,0 +1,2037 @@

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict = Dict[str, Any]
+ +
[docs] def add_cube(self, transit_source: str):
        """Reads a .lin file and adds it to existing TransitNetwork instance.

        Args:
            transit_source: a string or the directory of the cube line file to be parsed

        """

        """
        Figure out what kind of transit source it is
        """

        parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr")

        if "NAME=" in transit_source:
            WranglerLogger.debug("reading transit source as string")
            self.source_list.append("input_str")
            parse_tree = parser.parse(transit_source)
        elif os.path.isfile(transit_source):
            print("reading: {}".format(transit_source))
            with open(transit_source) as file:
                WranglerLogger.debug(
                    "reading transit source: {}".format(transit_source)
                )
                self.source_list.append(transit_source)
                parse_tree = parser.parse(file.read())
        elif os.path.isdir(transit_source):
            import glob

            for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")):
                self.add_cube(lin_file)
            return
        else:
            msg = "{} not a valid transit line string, directory, or file".format(
                transit_source
            )
            WranglerLogger.error(msg)
            raise ValueError(msg)

        WranglerLogger.debug("finished parsing cube line file")
        # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty()))
        transformed_tree_data = CubeTransformer().transform(parse_tree)
        # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data))

        _line_data = transformed_tree_data["lines"]

        line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()}
        line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()}
        new_lines = list(line_properties_dict.keys())
        """
        Before adding lines, check to see if any are overlapping with existing ones in the network
        """

        overlapping_lines = set(new_lines) & set(self.lines)
        if overlapping_lines:
            msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format(
                transit_source,
                "\n - ".join(self.source_list),
                len(overlapping_lines),
                len(new_lines),
                overlapping_lines,
            )
            print(msg)
            WranglerLogger.error(msg)
            raise ValueError(msg)

        self.program_type = transformed_tree_data.get("program_type", None)

        self.lines += new_lines
        self.line_properties.update(line_properties_dict)
        self.shapes.update(line_shapes_dict)

        WranglerLogger.debug("Added lines to CubeTransit: \n{}".format(new_lines))
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
+ +
[docs] def add_additional_time_periods(
        self, new_time_period_number: int, orig_line_name: str
    ):
        """
        Copies a route to another cube time period with appropriate
        values for time-period-specific properties.

        New properties are stored under the new name in:
          - ::self.shapes
          - ::self.line_properties

        Args:
            new_time_period_number (int): cube time period number
            orig_line_name(str): name of the originating line, from which
                the new line will copy its properties.

        Returns:
            Line name with new time period.
        """
        WranglerLogger.debug(
            "adding time periods {} to line {}".format(
                new_time_period_number, orig_line_name
            )
        )

        (
            route_id,
            _init_time_period,
            agency_id,
            direction_id,
        ) = CubeTransit.unpack_route_name(orig_line_name)
        new_time_period_name = self.parameters.cube_time_periods[new_time_period_number]
        new_tp_line_name = CubeTransit.build_route_name(
            route_id=route_id,
            time_period=new_time_period_name,
            agency_id=agency_id,
            direction_id=direction_id,
        )

        try:
            assert new_tp_line_name not in self.lines
        except:
            msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format(
                new_time_period_number, orig_line_name, new_tp_line_name
            )
            WranglerLogger.error(msg)
            raise ValueError(msg)

        # copy to a new line and add it to list of lines to add
        self.line_properties[new_tp_line_name] = copy.deepcopy(
            self.line_properties[orig_line_name]
        )
        self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name])
        self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name

        """
        Remove entries that aren't for this time period from the new line's properties list.
        """
        this_time_period_properties_list = [
            p + "[" + str(new_time_period_number) + "]"
            ##todo parameterize all time period specific variables
            for p in ["HEADWAY", "FREQ"]
        ]

        not_this_tp_properties_list = list(
            set(self.parameters.time_period_properties_list)
            - set(this_time_period_properties_list)
        )

        for k in not_this_tp_properties_list:
            self.line_properties[new_tp_line_name].pop(k, None)

        """
        Remove entries for time period from the original line's properties list.
        """
        for k in this_time_period_properties_list:
            self.line_properties[orig_line_name].pop(k, None)

        """
        Add new line to list of lines to add.
        """
        WranglerLogger.debug(
            "Adding new time period {} for line {} as {}.".format(
                new_time_period_number, orig_line_name, new_tp_line_name
            )
        )
        return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
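# Illustrative sketch of the resulting dictionary, for a hypothetical MTC-style
# line id "30_650_am_d0_s650012" (start/end times shown are placeholders; the
# real values come from the line's time-period properties via
# calculate_start_end_times()):
#
#     {"category": "Transit Service Property Change",
#      "facility": {"route_id": "650", "direction_id": 0, "shape_id": "650012",
#                   "start_time": "06:00", "end_time": "10:00"},
#      "properties": [{"property": "headway_secs", "set": 600}]}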
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs] def create_add_route_card_dict(self, line: str):
        """
        Creates a project card change formatted dictionary for adding
        a route based on the information in self.route_properties for
        the line.

        Args:
            line: name of line that is being added

        Returns:
            A project card change-formatted dictionary for the route addition.
        """
        start_time_str, end_time_str = self.calculate_start_end_times(
            self.line_properties[line]
        )

        standard_properties = self.cube_properties_to_standard_properties(
            self.line_properties[line]
        )

        routing_properties = {
            "property": "routing",
            "set": self.shapes[line]["node"].tolist(),
        }

        add_card_dict = {
            "category": "New Transit Service",
            "facility": {
                "route_id": line.split("_")[1],
                "direction_id": int(line.strip('"').split("_")[-1][-1]),
                "start_time": start_time_str,
                "end_time": end_time_str,
                "agency_id": line.strip('"').split("_")[0],
            },
            "properties": standard_properties + [routing_properties],
        }

        WranglerLogger.debug(
            "Adding {} route to changes:\n{}".format(line, add_card_dict)
        )
        return add_card_dict
+ +
[docs] @staticmethod + def get_time_period_numbers_from_cube_properties(properties_list: list): + """ + Finds properties that are associated with time periods and the + returns the numbers in them. + + Args: + properties_list (list): list of all properties. + + Returns: + list of strings of the time period numbers found + """ + time_periods_list = [] + for p in properties_list: + if ("[" not in p) or ("]" not in p): + continue + tp_num = p.split("[")[1][0] + if tp_num and tp_num not in time_periods_list: + time_periods_list.append(tp_num) + return time_periods_list
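# Illustrative sketch: only bracketed, time-period-specific property names such
# as "HEADWAY[1]" or "FREQ[2]" contribute a time-period number.
#
#     >>> props = {"NAME": '"0_452-111_452_pk1"', "HEADWAY[1]": 10, "MODE": 5}
#     >>> CubeTransit.get_time_period_numbers_from_cube_properties(props)
#     ['1']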
+ +
[docs] @staticmethod + def build_route_name( + route_id: str = "", + time_period: str = "", + agency_id: str = 0, + direction_id: str = 1, + ): + """ + Create a route name by contatenating route, time period, agency, and direction + + Args: + route_id: i.e. 452-111 + time_period: i.e. pk + direction_id: i.e. 1 + agency_id: i.e. 0 + + Returns: + constructed line_name i.e. "0_452-111_452_pk1" + """ + + return ( + str(agency_id) + + "_" + + str(route_id) + + "_" + + str(route_id.split("-")[0]) + + "_" + + str(time_period) + + str(direction_id) + )
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
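# Illustrative sketch: build_route_name() and unpack_route_name() mirror each
# other for names of the form <agency>_<route>_<route prefix>_<tod><direction>.
#
#     >>> CubeTransit.build_route_name(
#     ...     route_id="452-111", time_period="pk", agency_id="0", direction_id="1"
#     ... )
#     '0_452-111_452_pk1'
#     >>> CubeTransit.unpack_route_name('"0_452-111_452_pk1"')
#     ('452-111', 'pk', '0', '1')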
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + #WranglerLogger.error(msg) + #raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
+ +
[docs] @staticmethod
    def cube_properties_to_standard_properties(cube_properties_dict: dict):
        """
        Converts cube style properties to standard properties.

        This is most pertinent to time-period specific variables like headway,
        and variables that have standard units like headway, which is minutes
        in cube and seconds in standard format.

        Args:
            cube_properties_dict: <cube style property name> : <property value>

        Returns:
            A list of dictionaries with values for `"property": <standard
            style property name>, "set" : <property value with correct units>`

        """
        standard_properties_list = []
        for k, v in cube_properties_dict.items():
            change_item = {}
            if any(i in k for i in ["HEADWAY", "FREQ"]):
                change_item["property"] = "headway_secs"
                change_item["set"] = v * 60
            else:
                change_item["property"] = k
                change_item["set"] = v
            standard_properties_list.append(change_item)

        return standard_properties_list
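# Illustrative sketch: headway/frequency entries are renamed to headway_secs and
# converted from minutes to seconds; all other properties pass through as-is.
#
#     >>> CubeTransit.cube_properties_to_standard_properties(
#     ...     {"HEADWAY[1]": 10, "MODE": 5}
#     ... )
#     [{'property': 'headway_secs', 'set': 600}, {'property': 'MODE', 'set': 5}]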
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs returns list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: dDataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
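# Illustrative sketch (hypothetical helper and toy data): comparing a base and a
# build node series yields a single "routing" change covering the differing
# stretch of nodes plus a couple of anchoring nodes on either side.
def _example_route_shape_change():
    """Sketch of CubeTransit.evaluate_route_shape_changes() on toy node series."""
    base_shape = pd.Series([1, 2, 3, 4, 5])
    build_shape = pd.Series([1, 2, 6, 4, 5])
    # returns a list like [{"property": "routing", "existing": [...], "set": [...]}]
    return CubeTransit.evaluate_route_shape_changes(
        shape_build=build_shape, shape_base=base_shape
    )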
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
+ +
[docs] def shape_gtfs_to_dict_list(self, trip_id: str, shape_id: str, add_nntime: bool): + """ + This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of + stepping through the routed nodes and corresponding them with shape nodes. + + TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when + the transit routing on the roadway network is first performed. + + As such, I'm copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications. + + Args: + trip_id of the trip in question + shape_id of the trip in question + Returns: + list of dict records with columns: + trip_id + shape_id + shape_pt_sequence + shape_mode_node_id + is_stop + access + stop_sequence + """ + # get the stop times for this route + # https://developers.google.com/transit/gtfs/reference#stop_timestxt + trip_stop_times_df = self.feed.stop_times.loc[ self.feed.stop_times.trip_id == trip_id, + ['trip_id','arrival_time','departure_time','stop_id','stop_sequence','pickup_type','drop_off_type']].copy() + trip_stop_times_df.sort_values(by='stop_sequence', inplace=True) + trip_stop_times_df.reset_index(drop=True, inplace=True) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df: + # trip_id arrival_time departure_time stop_id stop_sequence pickup_type drop_off_type + # 0 10007 0 0 7781 1 0 NaN + # 1 10007 120 120 7845 2 0 NaN + # 2 10007 300 300 7790 3 0 NaN + # 3 10007 360 360 7854 4 0 NaN + # 4 10007 390 390 7951 5 0 NaN + # 5 10007 720 720 7950 6 0 NaN + # 6 10007 810 810 7850 7 0 NaN + # 7 10007 855 855 7945 8 0 NaN + # 8 10007 900 900 7803 9 0 NaN + # 9 10007 930 930 7941 10 0 NaN + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + + # get the shapes for this route + # https://developers.google.com/transit/gtfs/reference#shapestxt + trip_node_df = self.feed.shapes.loc[self.feed.shapes.shape_id == shape_id].copy() + trip_node_df.sort_values(by="shape_pt_sequence", inplace = True) + trip_node_df.reset_index(drop=True, inplace=True) + # print("trip_node_df.head(20):\n{}".format(trip_node_df.head(20))) + # print("trip_node_df.dtypes:\n{}".format(trip_node_df.dtypes)) + # trip_node_df: + # shape_id shape_pt_sequence shape_osm_node_id shape_shst_node_id shape_model_node_id shape_pt_lat shape_pt_lon + # 0 696 1 1429334016 35cb440c505534e8aedbd3a286b70eab 2139625 NaN NaN + # 1 696 2 444242480 39e263722d5849b3c732b48734671400 2164862 NaN NaN + # 2 696 3 5686705779 4c41c608c35f457079fd673bce5556e5 2169898 NaN NaN + # 3 696 4 3695761874 d0f5b2173189bbb1b5dbaa78a004e8c4 2021876 NaN NaN + # 4 696 5 1433982749 60726971f0fb359a57e9d8df30bf384b 2002078 NaN NaN + # 5 696 6 1433982740 634c301424647d5883191edf522180e3 2156807 NaN NaN + # 6 696 7 4915736746 f03c3d7f1aa0358a91c165f53dac1e20 2145185 NaN NaN + # 7 696 8 65604864 68b8df24f1572d267ecf834107741393 2120788 NaN NaN + # 8 696 9 65604866 e412a013ad45af6649fa1b396f74c127 2066513 NaN NaN + # 9 696 10 956664242 657e1602aa8585383ed058f28f7811ed 2006476 NaN NaN + # 10 696 11 291642561 726b03cced023a6459d7333885927208 2133933 NaN NaN + # 11 696 12 291642583 709a0c00811f213f7476349a2c002003 2159991 NaN NaN + # 12 696 13 291642745 c5aaab62e0c78c34d93ee57795f06953 2165343 NaN NaN + # 13 696 14 5718664845 c7f1f4aa88887071a0d28154fc84604b 2007965 NaN NaN + # 14 696 
15 291642692 0ef007a79b391e8ba98daf4985f26f9b 2160569 NaN NaN + # 15 696 16 5718664843 2ce63288e77747abc3a4124f0e28efcf 2047955 NaN NaN + # 16 696 17 3485537279 ec0c8eb524f41072a9fd87ecfd45e15f 2169094 NaN NaN + # 17 696 18 5718664419 57ca23828db4adea39355a92fb0fc3ff 2082102 NaN NaN + # 18 696 19 5718664417 4aba41268ada1058ee58e99a84e28d37 2019974 NaN NaN + # 19 696 20 65545418 d4f815a2f6da6c95d2f032a3cd61020c 2025374 NaN NaN # trip_node_df.dtypes: + # shape_id object + # shape_pt_sequence int64 + # shape_osm_node_id object + # shape_shst_node_id object + # shape_model_node_id object + # shape_pt_lat object + # shape_pt_lon object + + # we only need: shape_id, shape_pt_sequence, shape_model_node_id + trip_node_df = trip_node_df[['shape_id','shape_pt_sequence','shape_model_node_id']] + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + # stop_name object + # stop_lat float64 + # stop_lon float64 + # zone_id object + # agency_raw_name object + # stop_code object + # location_type float64 + # parent_station object + # stop_desc object + # stop_url object + # stop_timezone object + # wheelchair_boarding float64 + # platform_code object + # position object + # direction object + # * used by routes object + # osm_node_id object + # shst_node_id object + # model_node_id object + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # this is the same as shape_gtfs_to_cube but we'll build up a list of dicts with shape/stop information + shape_stop_dict_list = [] + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id' ] = trip_id + node_dict['is_stop' ] = True + node_dict['access' ] = access_v + node_dict['stop_sequence'] = stop_seq + shape_stop_dict_list.append(node_dict) + + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id'] = trip_id + node_dict['is_stop'] = False + shape_stop_dict_list.append(node_dict) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + # print("node_list_str: {}".format(node_list_str)) + return shape_stop_dict_list
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str
+ + +
[docs] def cube_format(self, row):
        """
        Creates a string representing the route in cube line file notation.
        #MC
        Args:
            row: row of a DataFrame representing a cube-formatted trip, with the Attributes
                trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

        Returns:
            string representation of route in cube line file notation
        """

        s = '\nLINE NAME="{}",'.format(row.NAME)
        s += '\n LONGNAME="{}",'.format(row.LONGNAME)
        s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY)
        s += "\n MODE={},".format(row.MODE)
        s += "\n ONEWAY={},".format(row.ONEWAY)
        s += "\n OPERATOR={},".format(row.OPERATOR)
        s += "\n NODES={}".format(self.shape_gtfs_to_cube(row))

        return s
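# Illustrative sketch of the LIN fragment cube_format() assembles for one trip
# row (all values hypothetical; the node list comes from shape_gtfs_to_cube()):
#
#     LINE NAME="0_452-111_452_pk1",
#      LONGNAME="Route 452 Express",
#      HEADWAY[1]=10,
#      MODE=5,
#      ONEWAY=T,
#      OPERATOR=3,
#      NODES=
#       39249,
#       -39240,
#       54648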
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + # moved this here from top since this StandardTransit shouldn't depend on mtc... + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = 
base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
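
The grammar and transformer above are meant to be used together through `lark`; the following is a minimal, hypothetical sketch (the sample line text, parser options, and variable names are illustrative rather than taken from lasso's own tests, and may need adjustment for your lark configuration):

```python
from lark import Lark

# Hypothetical single-route Cube PT line; HEADWAY[1] is the period-1 headway.
lin_text = 'LINE NAME="0_452-111_pk1", MODE=5, HEADWAY[1]=10, N=39249, -39240, 54648'

parser = Lark(TRANSIT_LINE_FILE_GRAMMAR)   # default Earley parser; lasso may configure this differently
parse_tree = parser.parse(lin_text)
transformed_tree_data = CubeTransformer().transform(parse_tree)

# One entry per line name; properties come back as a dict, the shape as a DataFrame
# with the node_id / node / stop / order columns built by lin_node().
(line_name, line_data), = transformed_tree_data["lines"].items()
line_properties = line_data["line_properties"]
line_shape_df = line_data["line_shape"]
```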
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_modules/lasso/util/index.html b/branch/remove_assignable/_modules/lasso/util/index.html new file mode 100644 index 0000000..4b81954 --- /dev/null +++ b/branch/remove_assignable/_modules/lasso/util/index.html @@ -0,0 +1,256 @@ + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+# assumed import: WranglerLogger is referenced in column_name_to_parts below
+from network_wrangler import WranglerLogger
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None): + """ + Calculated per: + https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565 + Expected in/out + -93.0965985, 44.952112199999995 osm_node_id = 954734870 + 69f13f881649cb21ee3b359730790bb9 + + """ + import hashlib + + message = "Intersection {0:.5f} {1:.5f}".format(long, lat) + if osm_node_id: + message += " {}".format(osm_node_id) + unhashed = message.encode("utf-8") + hash = hashlib.md5(unhashed).hexdigest() + return hash
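
A quick, hypothetical call of the hash helper above (argument order is `lat`, `long`, then the optional `osm_node_id`); the coordinates are the ones quoted in the docstring:

```python
intersection_hash = get_shared_streets_intersection_hash(
    lat=44.952112199999995, long=-93.0965985, osm_node_id=954734870
)
# 32-character hex MD5 digest of the formatted "Intersection <lon> <lat> <osm_node_id>" message
```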
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
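
A small, hypothetical sketch of the two time helpers above:

```python
t1 = hhmmss_to_datetime("06:30:00")   # -> datetime.time(6, 30)
t2 = secs_to_datetime(23400)          # 23400 seconds after midnight -> datetime.time(6, 30)
```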
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters): + """ + creates circular buffer polygon for node + + Args: + lat: node lat + lon: node lon + meters: buffer distance, radius of circle + Returns: + Polygon + """ + proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84') + # Azimuthal equidistant projection + aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0' + project = partial( + pyproj.transform, + pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)), + proj_wgs84) + buf = Point(0, 0).buffer(meters) # distance in meters + return Polygon(transform(project, buf).exterior.coords[:])
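
A hypothetical call of the buffer helper above; note that it uses the older `pyproj.transform`/`functools.partial` pattern, which recent pyproj releases mark as deprecated, so it may emit a warning on newer environments:

```python
# Roughly a 100 m circle, returned as a WGS84 lon/lat Polygon.
buffer_polygon = geodesic_point_buffer(lat=44.9537, lon=-93.0900, meters=100)
buffer_polygon.is_valid   # True
```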
+ +
[docs]def create_locationreference(node, link): + node['X'] = node['geometry'].apply(lambda p: p.x) + node['Y'] = node['geometry'].apply(lambda p: p.y) + node['point'] = [list(xy) for xy in zip(node.X, node.Y)] + node_dict = dict(zip(node.model_node_id, node.point)) + + link['A_point'] = link['A'].map(node_dict) + link['B_point'] = link['B'].map(node_dict) + link['locationReferences'] = link.apply(lambda x: [{'sequence':1, + 'point': x['A_point'], + 'distanceToNextRef':x['length'], + 'bearing' : 0, + 'intersectionId':x['fromIntersectionId']}, + {'sequence':2, + 'point': x['B_point'], + 'intersectionId':x['toIntersectionId']}], + axis = 1)
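
The helper above mutates its inputs in place (it returns nothing); a minimal, hypothetical sketch of the columns it expects, using a two-node, one-link frame:

```python
import pandas as pd
from shapely.geometry import Point

node = pd.DataFrame({
    "model_node_id": [1, 2],
    "geometry": [Point(-93.09, 44.95), Point(-93.08, 44.95)],
})
link = pd.DataFrame({
    "A": [1], "B": [2], "length": [880.0],
    "fromIntersectionId": ["hypothetical-from-id"],
    "toIntersectionId": ["hypothetical-to-id"],
})

create_locationreference(node, link)
link["locationReferences"][0]   # two-entry list: sequence-1 dict (with distanceToNextRef) and sequence-2 dict
```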
+ +
[docs]def column_name_to_parts(c, parameters=None): + + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + + return base_name, time_period, category, managed
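
A hedged example of the column splitter above, assuming the default `Parameters` list `lanes` in `properties_to_split` and include `AM` among the time periods (both assumptions, since the defaults live in the Parameters class):

```python
# Under the assumptions above:
base_name, time_period, category, managed = column_name_to_parts("lanes_AM")
# -> ("lanes", "AM", None, False)
```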
+ +
[docs]def shorten_name(name): + if type(name) == str: + name_list = name.split(',') + else: + name_list = name + name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list] + + name_list = list(set(name_list)) + #name_list.remove('') + + name_new = ' '.join(name_list).strip(' ') + + # convert non english character to english + name_new = unidecode(name_new) + + return name_new
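
A quick, hypothetical example of the name cleaner above (it splits on commas, strips non-word characters and 'nan', de-duplicates, and transliterates to ASCII):

```python
shorten_name("University Ave W, University Ave W, nan")   # -> 'University Ave W'
```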
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_modules/shapely/geometry/point/index.html b/branch/remove_assignable/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..57e05b8 --- /dev/null +++ b/branch/remove_assignable/_modules/shapely/geometry/point/index.html @@ -0,0 +1,252 @@ + + + + + + shapely.geometry.point — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+import numpy as np
+
+import shapely
+from shapely.errors import DimensionError
+from shapely.geometry.base import BaseGeometry
+
+__all__ = ["Point"]
+
+
+
[docs]class Point(BaseGeometry): + """ + A geometry type that represents a single coordinate with + x,y and possibly z values. + + A point is a zero-dimensional feature and has zero length and zero area. + + Parameters + ---------- + args : float, or sequence of floats + The coordinates can either be passed as a single parameter, or as + individual float values using multiple parameters: + + 1) 1 parameter: a sequence or array-like of with 2 or 3 values. + 2) 2 or 3 parameters (float): x, y, and possibly z. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Examples + -------- + Constructing the Point using separate parameters for x and y: + + >>> p = Point(1.0, -1.0) + + Constructing the Point using a list of x, y coordinates: + + >>> p = Point([1.0, -1.0]) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + __slots__ = [] + + def __new__(self, *args): + if len(args) == 0: + # empty geometry + # TODO better constructor + return shapely.from_wkt("POINT EMPTY") + elif len(args) > 3: + raise TypeError(f"Point() takes at most 3 arguments ({len(args)} given)") + elif len(args) == 1: + coords = args[0] + if isinstance(coords, Point): + return coords + + # Accept either (x, y) or [(x, y)] + if not hasattr(coords, "__getitem__"): # generators + coords = list(coords) + coords = np.asarray(coords).squeeze() + else: + # 2 or 3 args + coords = np.array(args).squeeze() + + if coords.ndim > 1: + raise ValueError( + f"Point() takes only scalar or 1-size vector arguments, got {args}" + ) + if not np.issubdtype(coords.dtype, np.number): + coords = [float(c) for c in coords] + geom = shapely.points(coords) + if not isinstance(geom, Point): + raise ValueError("Invalid values passed to Point constructor") + return geom + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return float(shapely.get_x(self)) + + @property + def y(self): + """Return y coordinate.""" + return float(shapely.get_y(self)) + + @property + def z(self): + """Return z coordinate.""" + if not shapely.has_z(self): + raise DimensionError("This point has no z coordinate.") + # return shapely.get_z(self) -> get_z only supported for GEOS 3.7+ + return self.coords[0][2] + + @property + def __geo_interface__(self): + return {"type": "Point", "coordinates": self.coords[0]} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3.0 * scale_factor, 1.0 * scale_factor, fill_color, opacity)
+ + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +shapely.lib.registry[0] = Point +
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_modules/shapely/geometry/polygon/index.html b/branch/remove_assignable/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..3f094ba --- /dev/null +++ b/branch/remove_assignable/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,462 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import numpy as np
+
+import shapely
+from shapely.algorithms.cga import is_ccw_impl, signed_area
+from shapely.errors import TopologicalError
+from shapely.geometry.base import BaseGeometry
+from shapely.geometry.linestring import LineString
+from shapely.geometry.point import Point
+
+__all__ = ["orient", "Polygon", "LinearRing"]
+
+
+def _unpickle_linearring(wkb):
+    linestring = shapely.from_wkb(wkb)
+    srid = shapely.get_srid(linestring)
+    linearring = shapely.linearrings(shapely.get_coordinates(linestring))
+    if srid:
+        linearring = shapely.set_srid(linearring, srid)
+    return linearring
+
+
+class LinearRing(LineString):
+    """
+    A geometry type composed of one or more line segments
+    that forms a closed loop.
+
+    A LinearRing is a closed, one-dimensional feature.
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+
+    Parameters
+    ----------
+    coordinates : sequence
+        A sequence of (x, y [,z]) numeric coordinate pairs or triples, or
+        an array-like with shape (N, 2) or (N, 3).
+        Also can be a sequence of Point objects.
+
+    Notes
+    -----
+    Rings are automatically closed. There is no need to specify a final
+    coordinate pair identical to the first.
+
+    Examples
+    --------
+    Construct a square ring.
+
+    >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+    >>> ring.is_closed
+    True
+    >>> list(ring.coords)
+    [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
+    >>> ring.length
+    4.0
+
+    """
+
+    __slots__ = []
+
+    def __new__(self, coordinates=None):
+        if coordinates is None:
+            # empty geometry
+            # TODO better way?
+            return shapely.from_wkt("LINEARRING EMPTY")
+        elif isinstance(coordinates, LineString):
+            if type(coordinates) == LinearRing:
+                # return original objects since geometries are immutable
+                return coordinates
+            elif not coordinates.is_valid:
+                raise TopologicalError("An input LineString must be valid.")
+            else:
+                # LineString
+                # TODO convert LineString to LinearRing more directly?
+                coordinates = coordinates.coords
+
+        else:
+            if hasattr(coordinates, "__array__"):
+                coordinates = np.asarray(coordinates)
+            if isinstance(coordinates, np.ndarray) and np.issubdtype(
+                coordinates.dtype, np.number
+            ):
+                pass
+            else:
+                # check coordinates on points
+                def _coords(o):
+                    if isinstance(o, Point):
+                        return o.coords[0]
+                    else:
+                        return [float(c) for c in o]
+
+                coordinates = np.array([_coords(o) for o in coordinates])
+                if not np.issubdtype(coordinates.dtype, np.number):
+                    # conversion of coords to 2D array failed, this might be due
+                    # to inconsistent coordinate dimensionality
+                    raise ValueError("Inconsistent coordinate dimensionality")
+
+        if len(coordinates) == 0:
+            # empty geometry
+            # TODO better constructor + should shapely.linearrings handle this?
+            return shapely.from_wkt("LINEARRING EMPTY")
+
+        geom = shapely.linearrings(coordinates)
+        if not isinstance(geom, LinearRing):
+            raise ValueError("Invalid values passed to LinearRing constructor")
+        return geom
+
+    @property
+    def __geo_interface__(self):
+        return {"type": "LinearRing", "coordinates": tuple(self.coords)}
+
+    def __reduce__(self):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        return (_unpickle_linearring, (shapely.to_wkb(self, include_srid=True),))
+
+    @property
+    def is_ccw(self):
+        """True is the ring is oriented counter clock-wise"""
+        return bool(is_ccw_impl()(self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return bool(shapely.is_simple(self))
+
+
+shapely.lib.registry[2] = LinearRing
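
As a small illustration of the LinearRing behavior documented above (automatic closure plus the `is_ccw` and `is_simple` predicates); the coordinates are arbitrary:

```python
from shapely.geometry import LinearRing

ring = LinearRing([(0, 0), (1, 0), (1, 1), (0, 1)])  # closing coordinate added automatically
ring.is_closed   # True
ring.is_ccw      # True: the ring winds counter-clockwise
ring.is_simple   # True: no self-intersections
```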
+
+
+class InteriorRingSequence:
+
+    _parent = None
+    _ndim = None
+    _index = 0
+    _length = 0
+
+    def __init__(self, parent):
+        self._parent = parent
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return shapely.get_num_interior_rings(self._parent)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    def _get_ring(self, i):
+        return shapely.get_interior_ring(self._parent, i)
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A geometry type representing an area that is enclosed by a linear ring. + + A polygon is a two-dimensional feature and has a non-zero area. It may + have one or more negative-space "holes" which are also bounded by linear + rings. If any rings cross each other, the feature is invalid and + operations on it may fail. + + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples, or + an array-like with shape (N, 2) or (N, 3). + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + + Examples + -------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + + __slots__ = [] + + def __new__(self, shell=None, holes=None): + if shell is None: + # empty geometry + # TODO better way? + return shapely.from_wkt("POLYGON EMPTY") + elif isinstance(shell, Polygon): + # return original objects since geometries are immutable + return shell + else: + shell = LinearRing(shell) + + if holes is not None: + if len(holes) == 0: + # shapely constructor cannot handle holes=[] + holes = None + else: + holes = [LinearRing(ring) for ring in holes] + + geom = shapely.polygons(shell, holes=holes) + if not isinstance(geom, Polygon): + raise ValueError("Invalid values passed to Polygon constructor") + return geom + + @property + def exterior(self): + return shapely.get_exterior_ring(self) + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not" + ) + + def __eq__(self, other): + if not isinstance(other, BaseGeometry): + return NotImplemented + if not isinstance(other, Polygon): + return False + check_empty = (self.is_empty, other.is_empty) + if all(check_empty): + return True + elif any(check_empty): + return False + my_coords = [self.exterior.coords] + [ + interior.coords for interior in self.interiors + ] + other_coords = [other.exterior.coords] + [ + interior.coords for interior in other.interiors + ] + if not len(my_coords) == len(other_coords): + return False + # equal_nan=False is the default, but not yet available for older numpy + return np.all( + [ + np.array_equal(left, right) # , equal_nan=False) + for left, right in zip(my_coords, other_coords) + ] + ) + + def __hash__(self): + return super().__hash__() + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return {"type": "Polygon", "coordinates": tuple(coords)} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] for interior in self.interiors + ] + path = " ".join( + [ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords + ] + ) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2.0 * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin)])
+ + +shapely.lib.registry[3] = Polygon + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring) / s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring) / s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) +
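
A brief, hypothetical check of `Polygon.from_bounds()` and the module-level `orient()` helper defined above:

```python
# from_bounds builds the shell (xmin, ymin) -> (xmin, ymax) -> (xmax, ymax) -> (xmax, ymin),
# which winds clockwise; orient(..., 1.0) returns a counter-clockwise copy.
square = Polygon.from_bounds(0.0, 0.0, 2.0, 1.0)
square.area                            # 2.0
square.exterior.is_ccw                 # False
orient(square, 1.0).exterior.is_ccw    # True
```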
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_modules/shapely/ops/index.html b/branch/remove_assignable/_modules/shapely/ops/index.html new file mode 100644 index 0000000..48045cb --- /dev/null +++ b/branch/remove_assignable/_modules/shapely/ops/index.html @@ -0,0 +1,845 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from warnings import warn
+
+import shapely
+from shapely.algorithms.polylabel import polylabel  # noqa
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.geometry import (
+    GeometryCollection,
+    LineString,
+    MultiLineString,
+    MultiPoint,
+    Point,
+    Polygon,
+    shape,
+)
+from shapely.geometry.base import BaseGeometry, BaseMultipartGeometry
+from shapely.geometry.polygon import orient as orient_
+from shapely.prepared import prep
+
+__all__ = [
+    "cascaded_union",
+    "linemerge",
+    "operator",
+    "polygonize",
+    "polygonize_full",
+    "transform",
+    "unary_union",
+    "triangulate",
+    "voronoi_diagram",
+    "split",
+    "nearest_points",
+    "validate",
+    "snap",
+    "shared_paths",
+    "clip_by_rect",
+    "orient",
+    "substring",
+]
+
+
+class CollectionOperator:
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        collection = shapely.polygonize(obs)
+        return collection.geoms
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, cut edges, dangles, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        return shapely.polygonize_full(obs)
+
+    def linemerge(self, lines, directed=False):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.  Returns a
+        LineString or MultiLineString when lines are not contiguous.
+        """
+        source = None
+        if getattr(lines, "geom_type", None) == "MultiLineString":
+            source = lines
+        elif hasattr(lines, "geoms"):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, "__iter__"):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError(f"Cannot linemerge {lines}")
+        return shapely.line_merge(source, directed=directed)
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        .. deprecated:: 1.8
+            This function was superseded by :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning,
+            stacklevel=2,
+        )
+        return shapely.union_all(geoms, axis=None)
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        Usually used to convert a collection into the smallest set of polygons
+        that cover the same area.
+        """
+        return shapely.union_all(geoms, axis=None)
+
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
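
A minimal, hypothetical use of the module-level wrappers bound just above (`polygonize` and `unary_union`):

```python
from shapely.geometry import LineString, Point

# Four touching segments polygonize into one unit square.
edges = [
    LineString([(0, 0), (1, 0)]), LineString([(1, 0), (1, 1)]),
    LineString([(1, 1), (0, 1)]), LineString([(0, 1), (0, 0)]),
]
polygons = list(polygonize(edges))
polygons[0].area                 # 1.0

# unary_union dissolves overlapping geometries into one.
blob = unary_union([Point(0, 0).buffer(1.0), Point(1, 0).buffer(1.0)])
blob.geom_type                   # 'Polygon'
```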
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise the list of LineString edges is returned.
+
+    """
+    collection = shapely.delaunay_triangles(geom, tolerance=tolerance, only_edges=edges)
+    return [g for g in collection.geoms]
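
A short, hypothetical example of `triangulate()` as defined above:

```python
from shapely.geometry import MultiPoint

pts = MultiPoint([(0, 0), (1, 0), (1, 1), (0, 1)])
triangles = triangulate(pts)                 # two triangles covering the unit square
[t.area for t in triangles]                  # [0.5, 0.5]
edge_lines = triangulate(pts, edges=True)    # LineString edges instead of Polygons
```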
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a list of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The tolerance `argument` can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    try:
+        result = shapely.voronoi_polygons(
+            geom, tolerance=tolerance, extend_to=envelope, only_edges=edges
+        )
+    except shapely.GEOSException as err:
+        errstr = "Could not create Voronoi Diagram with the specified inputs "
+        errstr += f"({err!s})."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr) from err
+
+    if result.geom_type != "GeometryCollection":
+        return GeometryCollection([result])
+    return result
+
+
+def validate(geom):
+    return shapely.is_valid_reason(geom)
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.geom_type in ("Point", "LineString", "LinearRing", "Polygon"): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)(zip(*func(*zip(*geom.exterior.coords)))) + holes = list( + type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)([func(*c) for c in geom.exterior.coords]) + holes = list( + type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + elif geom.geom_type.startswith("Multi") or geom.geom_type == "GeometryCollection": + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError(f"Type {geom.geom_type!r} not recognized")
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = shapely.shortest_line(g1, g2) + if seq is None: + if g1.is_empty: + raise ValueError("The first input geometry is empty") + else: + raise ValueError("The second input geometry is empty") + + p1 = shapely.get_point(seq, 0) + p2 = shapely.get_point(seq, 1) + return (p1, p2) + + +def snap(g1, g2, tolerance): + """ + Snaps an input geometry (g1) to reference (g2) geometry's vertices. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Refer to :func:`shapely.snap` for full documentation. + """ + + return shapely.snap(g1, g2, tolerance) + + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return shapely.shared_paths(g1, g2) + + +class SplitOp: + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [ + pg for pg in polygonize(union) if poly.contains(pg.representative_point()) + ] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.geom_type in ("Polygon", "MultiPolygon"): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance( + splitter, MultiLineString + ): + raise GeometryTypeError( + "Second argument must be either a LineString or a MultiLineString" + ) + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == "1": + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError("Input geometry segment overlaps with the splitter.") + elif relation[0] == "0" or relation[3] == "0": + # The splitter crosses or touches the line's 
interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, "0********"): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords) - 1): + point1 = coords[i] + point2 = coords[i + 1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx**2 + dy**2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [LineString(coords[: i + 2]), LineString(coords[i + 1 :])] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[: i + 1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i + 1 :]), + ] + return [line] + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. + When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. 
+ + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.geom_type in ("MultiLineString", "MultiPolygon"): + return GeometryCollection( + [i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms] + ) + + elif geom.geom_type == "LineString": + if splitter.geom_type in ( + "LineString", + "MultiLineString", + "Polygon", + "MultiPolygon", + ): + split_func = SplitOp._split_line_with_line + elif splitter.geom_type == "Point": + split_func = SplitOp._split_line_with_point + elif splitter.geom_type == "MultiPoint": + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError( + f"Splitting a LineString with a {splitter.geom_type} is not supported" + ) + + elif geom.geom_type == "Polygon": + if splitter.geom_type == "LineString": + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError( + f"Splitting a Polygon with a {splitter.geom_type} is not supported" + ) + + else: + raise GeometryTypeError( + f"Splitting {geom.geom_type} geometry is not supported" + ) + + return GeometryCollection(split_func(geom, splitter)) + + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. + + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError( + "Can only calculate a substring of LineString geometries. " + f"A {geom.geom_type} was provided." 
+ ) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [tuple(*end_point.coords)] + else: + vertex_list = [tuple(*start_point.coords)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append(tuple(*start_point.coords)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append(tuple(*end_point.coords)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + return shapely.clip_by_rect(geom, xmin, ymin, xmax, ymax) + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. + + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
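
To round out the helpers above that have no doctest of their own, here is a small, hypothetical sketch of `nearest_points()` and `snap()`; the expected results are reasoned from the GEOS semantics described in the docstrings, so treat them as illustrative:

```python
from shapely.geometry import LineString, Point
from shapely.ops import nearest_points, snap

line = LineString([(0, 0), (10, 0)])
pt = Point(0.4, 3.0)

on_line, on_pt = nearest_points(line, pt)   # returned in the same order as the inputs
on_line.wkt                                 # 'POINT (0.4 0)'

# snap() moves vertices of the first geometry onto the second geometry's vertices
# when they fall within the tolerance.
snap(pt, line, tolerance=4.0).wkt           # 'POINT (0 0)'
```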
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/_sources/_generated/lasso.CubeTransit.rst.txt b/branch/remove_assignable/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/branch/remove_assignable/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/branch/remove_assignable/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/branch/remove_assignable/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/branch/remove_assignable/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + ~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + 
~ModelRoadwayNetwork.ox_graph + ~ModelRoadwayNetwork.path_search + ~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/branch/remove_assignable/_sources/_generated/lasso.Parameters.rst.txt b/branch/remove_assignable/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..28d2c86 --- /dev/null +++ b/branch/remove_assignable/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.cube_time_periods + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/branch/remove_assignable/_sources/_generated/lasso.Project.rst.txt b/branch/remove_assignable/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..e6e6bcc --- /dev/null +++ b/branch/remove_assignable/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatability + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/branch/remove_assignable/_sources/_generated/lasso.StandardTransit.rst.txt b/branch/remove_assignable/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..4fae048 --- /dev/null +++ b/branch/remove_assignable/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,33 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_dict_list + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/branch/remove_assignable/_sources/_generated/lasso.logger.rst.txt b/branch/remove_assignable/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/branch/remove_assignable/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/branch/remove_assignable/_sources/_generated/lasso.util.rst.txt b/branch/remove_assignable/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/branch/remove_assignable/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/branch/remove_assignable/_sources/autodoc.rst.txt b/branch/remove_assignable/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/branch/remove_assignable/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/branch/remove_assignable/_sources/index.rst.txt b/branch/remove_assignable/_sources/index.rst.txt new file mode 100644 index 0000000..616853c --- /dev/null +++ b/branch/remove_assignable/_sources/index.rst.txt @@ -0,0 +1,36 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +`network_wrangler `_ package +for MetCouncil and MTC. It aims to have the following functionality: + +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for the respective agency and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/branch/remove_assignable/_sources/running.md.txt b/branch/remove_assignable/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/branch/remove_assignable/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/branch/remove_assignable/_sources/setup.md.txt b/branch/remove_assignable/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/branch/remove_assignable/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/branch/remove_assignable/_sources/starting.md.txt b/branch/remove_assignable/_sources/starting.md.txt new file mode 100644 index 0000000..8886f95 --- /dev/null +++ b/branch/remove_assignable/_sources/starting.md.txt @@ -0,0 +1,292 @@ +# Starting Out + +## Installation + +If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +Example using a conda environment (recommended) and using the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso , you can do so by installing it from the `develop` branch of + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n +conda activate +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The -e installs it in editable mode. +2. If you are not part of the project team and want to contribute code bxack to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. if you wanted to install from a specific tag/version number or branch, replace `@master` with `@` or `@tag` +4. 
+### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run; + - _Scenario_ object, which consists of at least a RoadwayNetwork and a +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, which creates project cards from one of the following: a base and a build transit network in cube format, a base and a build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object, which is a subclass of `RoadwayNetwork` and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object, which contains + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files`. + It can parse cube line file properties and shapes into python dictionaries and compare line files, representing changes as Project Card dictionaries. + - _Parameters_, a class representing all the parameters defining the networks + including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use the default parameters listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries, and manipulates roadway network data, which +is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_filename=MY_LINK_FILE, + node_filename=MY_NODE_FILE, + shape_filename=MY_SHAPE_FILE, + shape_foreign_key ='shape_id', + + ) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +net.apply_roadway_feature_change( + net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive") + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard +
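+Represents the data of a `project card`. The sketch below builds a card as a python dictionary, reusing the selection and lane change from the RoadwayNetwork example above, and writes it out as yml; the top-level field names are assumptions based on the network_wrangler project card schema, so validate any real card against that schema.
+
+```python
+import yaml
+
+# Illustrative only: a roadway property change card mirroring the selection and
+# change dictionaries used in the RoadwayNetwork example above. The top-level
+# keys ('project', 'category', 'facility', 'properties') are assumptions based
+# on the network_wrangler project card schema.
+example_card = {
+    "project": "example lane addition on I 35E",
+    "category": "Roadway Property Change",
+    "facility": {
+        "link": [{"name": ["I 35E"]}],
+        "A": {"osm_node_id": "961117623"},
+        "B": {"osm_node_id": "2564047368"},
+    },
+    "properties": [
+        {"property": "lanes", "existing": 1, "set": 2},
+    ],
+}
+
+with open("example_roadway_attribute_change.yml", "w") as f:
+    yaml.safe_dump(example_card, f)
+```
+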
+#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network. + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +``` + +#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork +class with additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema. + +```Python + +net = ModelRoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. + +```Python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary. + +```Python +# read parameters from a yaml configuration file +# (parameters could also be provided directly as a dictionary of key/value pairs) +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# the network is then written using the settings in the parameters given +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario +
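+A sketch of this step, reusing the `my_scenario` and `my_config` objects built in the Scenario and Parameters examples above (the `my_scenario.transit_net` attribute and the arguments to `StandardTransit.fromTransitNetwork` are assumptions; check the class documentation for the exact usage):
+
+```python
+import os
+
+# apply the scenario's project cards to the base networks
+my_scenario.apply_all_projects()
+
+# translate the wrangler roadway network to the MetCouncil model roadway network
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+)
+model_road_net.write_roadway_as_fixedwidth()
+
+# translate the scenario's transit network to Cube line files
+# (assumption: fromTransitNetwork accepts the scenario's transit network)
+cube_transit_net = StandardTransit.fromTransitNetwork(my_scenario.transit_net)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+```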
+ + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate <my_lasso_environment> + jupyter notebook + ``` diff --git a/branch/remove_assignable/_static/_sphinx_javascript_frameworks_compat.js b/branch/remove_assignable/_static/_sphinx_javascript_frameworks_compat.js new file mode 100644 index 0000000..8141580 --- /dev/null +++ b/branch/remove_assignable/_static/_sphinx_javascript_frameworks_compat.js @@ -0,0 +1,123 @@ +/* Compatability shim for jQuery and underscores.js. + * + * Copyright Sphinx contributors + * Released under the two clause BSD licence + */ + +/** + * small helper function to urldecode strings + * + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL + */ +jQuery.urldecode = function(x) { + if (!x) { + return x + } + return decodeURIComponent(x.replace(/\+/g, ' ')); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. Multiple values per key are supported, + * it will always return arrays of strings for the value parts.
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/branch/remove_assignable/_static/basic.css b/branch/remove_assignable/_static/basic.css new file mode 100644 index 0000000..cfc60b8 --- /dev/null +++ b/branch/remove_assignable/_static/basic.css @@ -0,0 +1,921 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +nav.contents, +aside.topic, +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + 
border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; 
+ word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +.sig dd { + margin-top: 0px; + margin-bottom: 0px; +} + +.sig dl { + margin-top: 0px; + margin-bottom: 0px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +.translated { + background-color: rgba(207, 255, 207, 0.2) +} + +.untranslated { + background-color: rgba(255, 207, 207, 0.2) +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: 
-1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/branch/remove_assignable/_static/css/badge_only.css b/branch/remove_assignable/_static/css/badge_only.css new file mode 100644 index 0000000..c718cee --- /dev/null +++ b/branch/remove_assignable/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd 
a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Bold.woff b/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Bold.woff2 b/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Regular.woff b/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Regular.woff2 b/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.eot b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.svg b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.ttf b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.woff b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.woff2 b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/branch/remove_assignable/_static/css/fonts/lato-bold-italic.woff b/branch/remove_assignable/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/lato-bold-italic.woff differ diff --git a/branch/remove_assignable/_static/css/fonts/lato-bold-italic.woff2 b/branch/remove_assignable/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/branch/remove_assignable/_static/css/fonts/lato-bold.woff b/branch/remove_assignable/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/lato-bold.woff differ diff --git a/branch/remove_assignable/_static/css/fonts/lato-bold.woff2 b/branch/remove_assignable/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/lato-bold.woff2 differ diff --git a/branch/remove_assignable/_static/css/fonts/lato-normal-italic.woff 
b/branch/remove_assignable/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/lato-normal-italic.woff differ diff --git a/branch/remove_assignable/_static/css/fonts/lato-normal-italic.woff2 b/branch/remove_assignable/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/branch/remove_assignable/_static/css/fonts/lato-normal.woff b/branch/remove_assignable/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/lato-normal.woff differ diff --git a/branch/remove_assignable/_static/css/fonts/lato-normal.woff2 b/branch/remove_assignable/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/branch/remove_assignable/_static/css/fonts/lato-normal.woff2 differ diff --git a/branch/remove_assignable/_static/css/theme.css b/branch/remove_assignable/_static/css/theme.css new file mode 100644 index 0000000..19a446a --- /dev/null +++ b/branch/remove_assignable/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 
0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 
.fa-pull-right.headerlink,.rst-content p .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download span.fa-pull-right:first-child,.wy-menu-vertical li.current>a button.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-right.toctree-expand,.wy-menu-vertical li button.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .eqno .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a button.pull-left.toctree-expand,.wy-menu-vertical li.on a button.pull-left.toctree-expand,.wy-menu-vertical li button.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .eqno .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt .pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a button.pull-right.toctree-expand,.wy-menu-vertical li.on a button.pull-right.toctree-expand,.wy-menu-vertical li button.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root 
.fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success 
/* [minified sphinx_rtd_theme stylesheet omitted: Font Awesome icon rules (glyph codepoints lost in extraction), admonition/alert colors, buttons, dropdowns, forms, switches, and table styles] */
table{margin-bottom:0!important}.wy-table-responsive table td,.wy-table-responsive table th{white-space:nowrap}a{color:#2980b9;text-decoration:none;cursor:pointer}a:hover{color:#3091d1}a:visited{color:#9b59b6}html{height:100%}body,html{overflow-x:hidden}body{font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;font-weight:400;color:#404040;min-height:100%;background:#edf0f2}.wy-text-left{text-align:left}.wy-text-center{text-align:center}.wy-text-right{text-align:right}.wy-text-large{font-size:120%}.wy-text-normal{font-size:100%}.wy-text-small,small{font-size:80%}.wy-text-strike{text-decoration:line-through}.wy-text-warning{color:#e67e22!important}a.wy-text-warning:hover{color:#eb9950!important}.wy-text-info{color:#2980b9!important}a.wy-text-info:hover{color:#409ad5!important}.wy-text-success{color:#27ae60!important}a.wy-text-success:hover{color:#36d278!important}.wy-text-danger{color:#e74c3c!important}a.wy-text-danger:hover{color:#ed7669!important}.wy-text-neutral{color:#404040!important}a.wy-text-neutral:hover{color:#595959!important}.rst-content .toctree-wrapper>p.caption,h1,h2,h3,h4,h5,h6,legend{margin-top:0;font-weight:700;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif}p{line-height:24px;font-size:16px;margin:0 0 24px}h1{font-size:175%}.rst-content .toctree-wrapper>p.caption,h2{font-size:150%}h3{font-size:125%}h4{font-size:115%}h5{font-size:110%}h6{font-size:100%}hr{display:block;height:1px;border:0;border-top:1px solid #e1e4e5;margin:24px 0;padding:0}.rst-content code,.rst-content tt,code{white-space:nowrap;max-width:100%;background:#fff;border:1px solid #e1e4e5;font-size:75%;padding:0 5px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#e74c3c;overflow-x:auto}.rst-content tt.code-large,code.code-large{font-size:90%}.rst-content .section ul,.rst-content .toctree-wrapper ul,.rst-content section ul,.wy-plain-list-disc,article ul{list-style:disc;line-height:24px;margin-bottom:24px}.rst-content .section ul li,.rst-content .toctree-wrapper ul li,.rst-content section ul li,.wy-plain-list-disc li,article ul li{list-style:disc;margin-left:24px}.rst-content .section ul li p:last-child,.rst-content .section ul li ul,.rst-content .toctree-wrapper ul li p:last-child,.rst-content .toctree-wrapper ul li ul,.rst-content section ul li p:last-child,.rst-content section ul li ul,.wy-plain-list-disc li p:last-child,.wy-plain-list-disc li ul,article ul li p:last-child,article ul li ul{margin-bottom:0}.rst-content .section ul li li,.rst-content .toctree-wrapper ul li li,.rst-content section ul li li,.wy-plain-list-disc li li,article ul li li{list-style:circle}.rst-content .section ul li li li,.rst-content .toctree-wrapper ul li li li,.rst-content section ul li li li,.wy-plain-list-disc li li li,article ul li li li{list-style:square}.rst-content .section ul li ol li,.rst-content .toctree-wrapper ul li ol li,.rst-content section ul li ol li,.wy-plain-list-disc li ol li,article ul li ol li{list-style:decimal}.rst-content .section ol,.rst-content .section ol.arabic,.rst-content .toctree-wrapper ol,.rst-content .toctree-wrapper ol.arabic,.rst-content section ol,.rst-content section ol.arabic,.wy-plain-list-decimal,article ol{list-style:decimal;line-height:24px;margin-bottom:24px}.rst-content .section ol.arabic li,.rst-content .section ol li,.rst-content .toctree-wrapper ol.arabic li,.rst-content .toctree-wrapper ol li,.rst-content section ol.arabic li,.rst-content section ol li,.wy-plain-list-decimal li,article ol 
li{list-style:decimal;margin-left:24px}.rst-content .section ol.arabic li ul,.rst-content .section ol li p:last-child,.rst-content .section ol li ul,.rst-content .toctree-wrapper ol.arabic li ul,.rst-content .toctree-wrapper ol li p:last-child,.rst-content .toctree-wrapper ol li ul,.rst-content section ol.arabic li ul,.rst-content section ol li p:last-child,.rst-content section ol li ul,.wy-plain-list-decimal li p:last-child,.wy-plain-list-decimal li ul,article ol li p:last-child,article ol li ul{margin-bottom:0}.rst-content .section ol.arabic li ul li,.rst-content .section ol li ul li,.rst-content .toctree-wrapper ol.arabic li ul li,.rst-content .toctree-wrapper ol li ul li,.rst-content section ol.arabic li ul li,.rst-content section ol li ul li,.wy-plain-list-decimal li ul li,article ol li ul li{list-style:disc}.wy-breadcrumbs{*zoom:1}.wy-breadcrumbs:after,.wy-breadcrumbs:before{display:table;content:""}.wy-breadcrumbs:after{clear:both}.wy-breadcrumbs>li{display:inline-block;padding-top:5px}.wy-breadcrumbs>li.wy-breadcrumbs-aside{float:right}.rst-content .wy-breadcrumbs>li code,.rst-content .wy-breadcrumbs>li tt,.wy-breadcrumbs>li .rst-content tt,.wy-breadcrumbs>li code{all:inherit;color:inherit}.breadcrumb-item:before{content:"/";color:#bbb;font-size:13px;padding:0 6px 0 3px}.wy-breadcrumbs-extra{margin-bottom:0;color:#b3b3b3;font-size:80%;display:inline-block}@media screen and (max-width:480px){.wy-breadcrumbs-extra,.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}@media print{.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}html{font-size:16px}.wy-affix{position:fixed;top:1.618em}.wy-menu a:hover{text-decoration:none}.wy-menu-horiz{*zoom:1}.wy-menu-horiz:after,.wy-menu-horiz:before{display:table;content:""}.wy-menu-horiz:after{clear:both}.wy-menu-horiz li,.wy-menu-horiz ul{display:inline-block}.wy-menu-horiz li:hover{background:hsla(0,0%,100%,.1)}.wy-menu-horiz li.divide-left{border-left:1px solid #404040}.wy-menu-horiz li.divide-right{border-right:1px solid #404040}.wy-menu-horiz a{height:32px;display:inline-block;line-height:32px;padding:0 16px}.wy-menu-vertical{width:300px}.wy-menu-vertical header,.wy-menu-vertical p.caption{color:#55a5d9;height:32px;line-height:32px;padding:0 1.618em;margin:12px 0 0;display:block;font-weight:700;text-transform:uppercase;font-size:85%;white-space:nowrap}.wy-menu-vertical ul{margin-bottom:0}.wy-menu-vertical li.divide-top{border-top:1px solid #404040}.wy-menu-vertical li.divide-bottom{border-bottom:1px solid #404040}.wy-menu-vertical li.current{background:#e3e3e3}.wy-menu-vertical li.current a{color:grey;border-right:1px solid #c9c9c9;padding:.4045em 2.427em}.wy-menu-vertical li.current a:hover{background:#d6d6d6}.rst-content .wy-menu-vertical li tt,.wy-menu-vertical li .rst-content tt,.wy-menu-vertical li code{border:none;background:inherit;color:inherit;padding-left:0;padding-right:0}.wy-menu-vertical li button.toctree-expand{display:block;float:left;margin-left:-1.2em;line-height:18px;color:#4d4d4d;border:none;background:none;padding:0}.wy-menu-vertical li.current>a,.wy-menu-vertical li.on a{color:#404040;font-weight:700;position:relative;background:#fcfcfc;border:none;padding:.4045em 1.618em}.wy-menu-vertical li.current>a:hover,.wy-menu-vertical li.on a:hover{background:#fcfcfc}.wy-menu-vertical li.current>a:hover button.toctree-expand,.wy-menu-vertical li.on a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a 
button.toctree-expand{display:block;line-height:18px;color:#333}.wy-menu-vertical li.toctree-l1.current>a{border-bottom:1px solid #c9c9c9;border-top:1px solid #c9c9c9}.wy-menu-vertical .toctree-l1.current .toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .toctree-l11>ul{display:none}.wy-menu-vertical .toctree-l1.current .current.toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .current.toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .current.toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .current.toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .current.toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .current.toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .current.toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .current.toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .current.toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .current.toctree-l11>ul{display:block}.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4{font-size:.9em}.wy-menu-vertical li.toctree-l2 a,.wy-menu-vertical li.toctree-l3 a,.wy-menu-vertical li.toctree-l4 a,.wy-menu-vertical li.toctree-l5 a,.wy-menu-vertical li.toctree-l6 a,.wy-menu-vertical li.toctree-l7 a,.wy-menu-vertical li.toctree-l8 a,.wy-menu-vertical li.toctree-l9 a,.wy-menu-vertical li.toctree-l10 a{color:#404040}.wy-menu-vertical li.toctree-l2 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l3 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l4 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l5 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l6 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l7 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l8 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l9 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l10 a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{display:block}.wy-menu-vertical li.toctree-l2.current>a{padding:.4045em 2.427em}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{padding:.4045em 1.618em .4045em 4.045em}.wy-menu-vertical li.toctree-l3.current>a{padding:.4045em 4.045em}.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{padding:.4045em 1.618em .4045em 5.663em}.wy-menu-vertical li.toctree-l4.current>a{padding:.4045em 5.663em}.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a{padding:.4045em 1.618em .4045em 7.281em}.wy-menu-vertical li.toctree-l5.current>a{padding:.4045em 7.281em}.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a{padding:.4045em 1.618em .4045em 8.899em}.wy-menu-vertical li.toctree-l6.current>a{padding:.4045em 
8.899em}.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a{padding:.4045em 1.618em .4045em 10.517em}.wy-menu-vertical li.toctree-l7.current>a{padding:.4045em 10.517em}.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a{padding:.4045em 1.618em .4045em 12.135em}.wy-menu-vertical li.toctree-l8.current>a{padding:.4045em 12.135em}.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a{padding:.4045em 1.618em .4045em 13.753em}.wy-menu-vertical li.toctree-l9.current>a{padding:.4045em 13.753em}.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a{padding:.4045em 1.618em .4045em 15.371em}.wy-menu-vertical li.toctree-l10.current>a{padding:.4045em 15.371em}.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{padding:.4045em 1.618em .4045em 16.989em}.wy-menu-vertical li.toctree-l2.current>a,.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{background:#c9c9c9}.wy-menu-vertical li.toctree-l2 button.toctree-expand{color:#a3a3a3}.wy-menu-vertical li.toctree-l3.current>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{background:#bdbdbd}.wy-menu-vertical li.toctree-l3 button.toctree-expand{color:#969696}.wy-menu-vertical li.current ul{display:block}.wy-menu-vertical li ul{margin-bottom:0;display:none}.wy-menu-vertical li ul li a{margin-bottom:0;color:#d9d9d9;font-weight:400}.wy-menu-vertical a{line-height:18px;padding:.4045em 1.618em;display:block;position:relative;font-size:90%;color:#d9d9d9}.wy-menu-vertical a:hover{background-color:#4e4a4a;cursor:pointer}.wy-menu-vertical a:hover button.toctree-expand{color:#d9d9d9}.wy-menu-vertical a:active{background-color:#2980b9;cursor:pointer;color:#fff}.wy-menu-vertical a:active button.toctree-expand{color:#fff}.wy-side-nav-search{display:block;width:300px;padding:.809em;margin-bottom:.809em;z-index:200;background-color:#2980b9;text-align:center;color:#fcfcfc}.wy-side-nav-search input[type=text]{width:100%;border-radius:50px;padding:6px 12px;border-color:#2472a4}.wy-side-nav-search img{display:block;margin:auto auto .809em;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a{color:#fcfcfc;font-size:100%;font-weight:700;display:inline-block;padding:4px 6px;margin-bottom:.809em;max-width:100%}.wy-side-nav-search .wy-dropdown>a:hover,.wy-side-nav-search>a:hover{background:hsla(0,0%,100%,.1)}.wy-side-nav-search .wy-dropdown>a img.logo,.wy-side-nav-search>a img.logo{display:block;margin:0 auto;height:auto;width:auto;border-radius:0;max-width:100%;background:transparent}.wy-side-nav-search .wy-dropdown>a.icon img.logo,.wy-side-nav-search>a.icon img.logo{margin-top:.85em}.wy-side-nav-search>div.version{margin-top:-.4045em;margin-bottom:.809em;font-weight:400;color:hsla(0,0%,100%,.3)}.wy-nav .wy-menu-vertical header{color:#2980b9}.wy-nav .wy-menu-vertical a{color:#b3b3b3}.wy-nav .wy-menu-vertical a:hover{background-color:#2980b9;color:#fff}[data-menu-wrap]{-webkit-transition:all .2s ease-in;-moz-transition:all .2s ease-in;transition:all .2s 
ease-in;position:absolute;opacity:1;width:100%;opacity:0}[data-menu-wrap].move-center{left:0;right:auto;opacity:1}[data-menu-wrap].move-left{right:auto;left:-100%;opacity:0}[data-menu-wrap].move-right{right:-100%;left:auto;opacity:0}.wy-body-for-nav{background:#fcfcfc}.wy-grid-for-nav{position:absolute;width:100%;height:100%}.wy-nav-side{position:fixed;top:0;bottom:0;left:0;padding-bottom:2em;width:300px;overflow-x:hidden;overflow-y:hidden;min-height:100%;color:#9b9b9b;background:#343131;z-index:200}.wy-side-scroll{width:320px;position:relative;overflow-x:hidden;overflow-y:scroll;height:100%}.wy-nav-top{display:none;background:#2980b9;color:#fff;padding:.4045em .809em;position:relative;line-height:50px;text-align:center;font-size:100%;*zoom:1}.wy-nav-top:after,.wy-nav-top:before{display:table;content:""}.wy-nav-top:after{clear:both}.wy-nav-top a{color:#fff;font-weight:700}.wy-nav-top img{margin-right:12px;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-nav-top i{font-size:30px;float:left;cursor:pointer;padding-top:inherit}.wy-nav-content-wrap{margin-left:300px;background:#fcfcfc;min-height:100%}.wy-nav-content{padding:1.618em 3.236em;height:100%;max-width:800px;margin:auto}.wy-body-mask{position:fixed;width:100%;height:100%;background:rgba(0,0,0,.2);display:none;z-index:499}.wy-body-mask.on{display:block}footer{color:grey}footer p{margin-bottom:12px}.rst-content footer span.commit tt,footer span.commit .rst-content tt,footer span.commit code{padding:0;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:1em;background:none;border:none;color:grey}.rst-footer-buttons{*zoom:1}.rst-footer-buttons:after,.rst-footer-buttons:before{width:100%;display:table;content:""}.rst-footer-buttons:after{clear:both}.rst-breadcrumbs-buttons{margin-top:12px;*zoom:1}.rst-breadcrumbs-buttons:after,.rst-breadcrumbs-buttons:before{display:table;content:""}.rst-breadcrumbs-buttons:after{clear:both}#search-results .search li{margin-bottom:24px;border-bottom:1px solid #e1e4e5;padding-bottom:24px}#search-results .search li:first-child{border-top:1px solid #e1e4e5;padding-top:24px}#search-results .search li a{font-size:120%;margin-bottom:12px;display:inline-block}#search-results .context{color:grey;font-size:90%}.genindextable li>ul{margin-left:24px}@media screen and (max-width:768px){.wy-body-for-nav{background:#fcfcfc}.wy-nav-top{display:block}.wy-nav-side{left:-300px}.wy-nav-side.shift{width:85%;left:0}.wy-menu.wy-menu-vertical,.wy-side-nav-search,.wy-side-scroll{width:auto}.wy-nav-content-wrap{margin-left:0}.wy-nav-content-wrap .wy-nav-content{padding:1.618em}.wy-nav-content-wrap.shift{position:fixed;min-width:100%;left:85%;top:0;height:100%;overflow:hidden}}@media screen and (min-width:1100px){.wy-nav-content-wrap{background:rgba(0,0,0,.05)}.wy-nav-content{margin:0;background:#fcfcfc}}@media print{.rst-versions,.wy-nav-side,footer{display:none}.wy-nav-content-wrap{margin-left:0}}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60;*zoom:1}.rst-versions .rst-current-version:after,.rst-versions .rst-current-version:before{display:table;content:""}.rst-versions 
.rst-current-version:after{clear:both}.rst-content .code-block-caption .rst-versions .rst-current-version .headerlink,.rst-content .eqno .rst-versions .rst-current-version .headerlink,.rst-content .rst-versions .rst-current-version .admonition-title,.rst-content code.download .rst-versions .rst-current-version span:first-child,.rst-content dl dt .rst-versions .rst-current-version .headerlink,.rst-content h1 .rst-versions .rst-current-version .headerlink,.rst-content h2 .rst-versions .rst-current-version .headerlink,.rst-content h3 .rst-versions .rst-current-version .headerlink,.rst-content h4 .rst-versions .rst-current-version .headerlink,.rst-content h5 .rst-versions .rst-current-version .headerlink,.rst-content h6 .rst-versions .rst-current-version .headerlink,.rst-content p .rst-versions .rst-current-version .headerlink,.rst-content table>caption .rst-versions .rst-current-version .headerlink,.rst-content tt.download .rst-versions .rst-current-version span:first-child,.rst-versions .rst-current-version .fa,.rst-versions .rst-current-version .icon,.rst-versions .rst-current-version .rst-content .admonition-title,.rst-versions .rst-current-version .rst-content .code-block-caption .headerlink,.rst-versions .rst-current-version .rst-content .eqno .headerlink,.rst-versions .rst-current-version .rst-content code.download span:first-child,.rst-versions .rst-current-version .rst-content dl dt .headerlink,.rst-versions .rst-current-version .rst-content h1 .headerlink,.rst-versions .rst-current-version .rst-content h2 .headerlink,.rst-versions .rst-current-version .rst-content h3 .headerlink,.rst-versions .rst-current-version .rst-content h4 .headerlink,.rst-versions .rst-current-version .rst-content h5 .headerlink,.rst-versions .rst-current-version .rst-content h6 .headerlink,.rst-versions .rst-current-version .rst-content p .headerlink,.rst-versions .rst-current-version .rst-content table>caption .headerlink,.rst-versions .rst-current-version .rst-content tt.download span:first-child,.rst-versions .rst-current-version .wy-menu-vertical li button.toctree-expand,.wy-menu-vertical li .rst-versions .rst-current-version button.toctree-expand{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and 
(max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}.rst-content .toctree-wrapper>p.caption,.rst-content h1,.rst-content h2,.rst-content h3,.rst-content h4,.rst-content h5,.rst-content h6{margin-bottom:24px}.rst-content img{max-width:100%;height:auto}.rst-content div.figure,.rst-content figure{margin-bottom:24px}.rst-content div.figure .caption-text,.rst-content figure .caption-text{font-style:italic}.rst-content div.figure p:last-child.caption,.rst-content figure p:last-child.caption{margin-bottom:0}.rst-content div.figure.align-center,.rst-content figure.align-center{text-align:center}.rst-content .section>a>img,.rst-content .section>img,.rst-content section>a>img,.rst-content section>img{margin-bottom:24px}.rst-content abbr[title]{text-decoration:none}.rst-content.style-external-links a.reference.external:after{font-family:FontAwesome;content:"\f08e";color:#b3b3b3;vertical-align:super;font-size:60%;margin:0 .2em}.rst-content blockquote{margin-left:24px;line-height:24px;margin-bottom:24px}.rst-content pre.literal-block{white-space:pre;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;display:block;overflow:auto}.rst-content div[class^=highlight],.rst-content pre.literal-block{border:1px solid #e1e4e5;overflow-x:auto;margin:1px 0 24px}.rst-content div[class^=highlight] div[class^=highlight],.rst-content pre.literal-block div[class^=highlight]{padding:0;border:none;margin:0}.rst-content div[class^=highlight] td.code{width:100%}.rst-content .linenodiv pre{border-right:1px solid #e6e9ea;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;user-select:none;pointer-events:none}.rst-content div[class^=highlight] pre{white-space:pre;margin:0;padding:12px;display:block;overflow:auto}.rst-content div[class^=highlight] pre .hll{display:block;margin:0 -12px;padding:0 12px}.rst-content .linenodiv pre,.rst-content div[class^=highlight] pre,.rst-content pre.literal-block{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:12px;line-height:1.4}.rst-content div.highlight .gp,.rst-content div.highlight span.linenos{user-select:none;pointer-events:none}.rst-content div.highlight span.linenos{display:inline-block;padding-left:0;padding-right:12px;margin-right:12px;border-right:1px solid #e6e9ea}.rst-content .code-block-caption{font-style:italic;font-size:85%;line-height:1;padding:1em 0;text-align:center}@media print{.rst-content .codeblock,.rst-content div[class^=highlight],.rst-content div[class^=highlight] pre{white-space:pre-wrap}}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning{clear:both}.rst-content .admonition-todo .last,.rst-content .admonition-todo>:last-child,.rst-content .admonition .last,.rst-content .admonition>:last-child,.rst-content .attention .last,.rst-content .attention>:last-child,.rst-content .caution .last,.rst-content .caution>:last-child,.rst-content .danger .last,.rst-content .danger>:last-child,.rst-content .error .last,.rst-content .error>:last-child,.rst-content .hint .last,.rst-content .hint>:last-child,.rst-content .important .last,.rst-content .important>:last-child,.rst-content .note .last,.rst-content .note>:last-child,.rst-content .seealso 
.last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section ol.loweralpha>li,.rst-content .toctree-wrapper ol.loweralpha,.rst-content .toctree-wrapper ol.loweralpha>li,.rst-content section ol.loweralpha,.rst-content section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li,.rst-content .toctree-wrapper ol.upperalpha,.rst-content .toctree-wrapper ol.upperalpha>li,.rst-content section ol.upperalpha,.rst-content section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*,.rst-content .toctree-wrapper ol li>*,.rst-content .toctree-wrapper ul li>*,.rst-content section ol li>*,.rst-content section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child,.rst-content .toctree-wrapper ol li>:first-child,.rst-content .toctree-wrapper ul li>:first-child,.rst-content section ol li>:first-child,.rst-content section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child,.rst-content .toctree-wrapper ol li>p,.rst-content .toctree-wrapper ol li>p:last-child,.rst-content .toctree-wrapper ul li>p,.rst-content .toctree-wrapper ul li>p:last-child,.rst-content section ol li>p,.rst-content section ol li>p:last-child,.rst-content section ul li>p,.rst-content section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child,.rst-content .toctree-wrapper ol li>p:only-child,.rst-content .toctree-wrapper ol li>p:only-child:last-child,.rst-content .toctree-wrapper ul li>p:only-child,.rst-content .toctree-wrapper ul li>p:only-child:last-child,.rst-content section ol li>p:only-child,.rst-content section ol li>p:only-child:last-child,.rst-content section ul li>p:only-child,.rst-content section ul li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul,.rst-content .toctree-wrapper ol li>ol,.rst-content .toctree-wrapper ol li>ul,.rst-content .toctree-wrapper ul li>ol,.rst-content .toctree-wrapper ul li>ul,.rst-content section ol li>ol,.rst-content section ol li>ul,.rst-content section ul li>ol,.rst-content section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul,.rst-content .toctree-wrapper ol.simple li>*,.rst-content .toctree-wrapper ol.simple li ol,.rst-content .toctree-wrapper ol.simple li ul,.rst-content .toctree-wrapper ul.simple li>*,.rst-content .toctree-wrapper ul.simple li ol,.rst-content .toctree-wrapper ul.simple li ul,.rst-content section ol.simple li>*,.rst-content section ol.simple li ol,.rst-content section ol.simple li 
ul,.rst-content section ul.simple li>*,.rst-content section ul.simple li ol,.rst-content section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink{opacity:0;font-size:14px;font-family:FontAwesome;margin-left:.5em}.rst-content .code-block-caption .headerlink:focus,.rst-content .code-block-caption:hover .headerlink,.rst-content .eqno .headerlink:focus,.rst-content .eqno:hover .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink:focus,.rst-content .toctree-wrapper>p.caption:hover .headerlink,.rst-content dl dt .headerlink:focus,.rst-content dl dt:hover .headerlink,.rst-content h1 .headerlink:focus,.rst-content h1:hover .headerlink,.rst-content h2 .headerlink:focus,.rst-content h2:hover .headerlink,.rst-content h3 .headerlink:focus,.rst-content h3:hover .headerlink,.rst-content h4 .headerlink:focus,.rst-content h4:hover .headerlink,.rst-content h5 .headerlink:focus,.rst-content h5:hover .headerlink,.rst-content h6 .headerlink:focus,.rst-content h6:hover .headerlink,.rst-content p.caption .headerlink:focus,.rst-content p.caption:hover .headerlink,.rst-content p .headerlink:focus,.rst-content p:hover .headerlink,.rst-content table>caption .headerlink:focus,.rst-content table>caption:hover .headerlink{opacity:1}.rst-content p a{overflow-wrap:anywhere}.rst-content .wy-table td p,.rst-content .wy-table td ul,.rst-content .wy-table th p,.rst-content .wy-table th ul,.rst-content table.docutils td p,.rst-content table.docutils td ul,.rst-content table.docutils th p,.rst-content table.docutils th ul,.rst-content table.field-list td p,.rst-content table.field-list td ul,.rst-content table.field-list th p,.rst-content table.field-list th ul{font-size:inherit}.rst-content .btn:focus{outline:2px solid}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .citation-reference>span.fn-bracket,.rst-content 
.footnote-reference>span.fn-bracket{display:none}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : "}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:auto minmax(80%,95%)}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{display:inline-grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{display:grid;grid-template-columns:auto auto minmax(.65rem,auto) minmax(40%,95%)}html.writer-html5 .rst-content aside.citation>span.label,html.writer-html5 .rst-content aside.footnote>span.label,html.writer-html5 .rst-content div.citation>span.label{grid-column-start:1;grid-column-end:2}html.writer-html5 .rst-content aside.citation>span.backrefs,html.writer-html5 .rst-content aside.footnote>span.backrefs,html.writer-html5 .rst-content div.citation>span.backrefs{grid-column-start:2;grid-column-end:3;grid-row-start:1;grid-row-end:3}html.writer-html5 .rst-content aside.citation>p,html.writer-html5 .rst-content aside.footnote>p,html.writer-html5 .rst-content div.citation>p{grid-column-start:4;grid-column-end:5}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{margin-bottom:24px}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.citation>dt>span.brackets:before,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.citation>dt>span.brackets:after,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a,html.writer-html5 
.rst-content dl.footnote>dt>span.fn-backref>a{word-break:keep-all}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a:not(:first-child):before,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.citation>dd p,html.writer-html5 .rst-content dl.footnote>dd p{font-size:.9rem}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{padding-left:1rem;padding-right:1rem;font-size:.9rem;line-height:1.2rem}html.writer-html5 .rst-content aside.citation p,html.writer-html5 .rst-content aside.footnote p,html.writer-html5 .rst-content div.citation p{font-size:.9rem;line-height:1.2rem;margin-bottom:12px}html.writer-html5 .rst-content aside.citation span.backrefs,html.writer-html5 .rst-content aside.footnote span.backrefs,html.writer-html5 .rst-content div.citation span.backrefs{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content aside.citation span.backrefs>a,html.writer-html5 .rst-content aside.footnote span.backrefs>a,html.writer-html5 .rst-content div.citation span.backrefs>a{word-break:keep-all}html.writer-html5 .rst-content aside.citation span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content aside.footnote span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content div.citation span.backrefs>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content aside.citation span.label,html.writer-html5 .rst-content aside.footnote span.label,html.writer-html5 .rst-content div.citation span.label{line-height:1.2rem}html.writer-html5 .rst-content aside.citation-list,html.writer-html5 .rst-content aside.footnote-list,html.writer-html5 .rst-content div.citation-list{margin-bottom:24px}html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content aside.footnote-list aside.footnote,html.writer-html5 .rst-content div.citation-list>div.citation,html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content aside.footnote-list aside.footnote code,html.writer-html5 .rst-content aside.footnote-list aside.footnote tt,html.writer-html5 .rst-content aside.footnote code,html.writer-html5 .rst-content aside.footnote tt,html.writer-html5 .rst-content div.citation-list>div.citation code,html.writer-html5 .rst-content div.citation-list>div.citation tt,html.writer-html5 .rst-content dl.citation code,html.writer-html5 .rst-content dl.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content 
.wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel,.rst-content .menuselection{font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .guilabel,.rst-content .menuselection{border:1px solid #7fbbe3;background:#e7f2fa}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/branch/remove_assignable/_static/doctools.js b/branch/remove_assignable/_static/doctools.js new file mode 100644 index 0000000..d06a71d --- /dev/null +++ b/branch/remove_assignable/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. 
+ */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git 
a/branch/remove_assignable/_static/documentation_options.js b/branch/remove_assignable/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/branch/remove_assignable/_static/documentation_options.js @@ -0,0 +1,14 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/branch/remove_assignable/_static/file.png b/branch/remove_assignable/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/branch/remove_assignable/_static/file.png differ diff --git a/branch/remove_assignable/_static/graphviz.css b/branch/remove_assignable/_static/graphviz.css new file mode 100644 index 0000000..8d81c02 --- /dev/null +++ b/branch/remove_assignable/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/branch/remove_assignable/_static/jquery.js b/branch/remove_assignable/_static/jquery.js new file mode 100644 index 0000000..c4c6022 --- /dev/null +++ b/branch/remove_assignable/_static/jquery.js @@ -0,0 +1,2 @@ +/*! jQuery v3.6.0 | (c) OpenJS Foundation and other contributors | jquery.org/license */ +!function(e,t){"use strict";"object"==typeof module&&"object"==typeof module.exports?module.exports=e.document?t(e,!0):function(e){if(!e.document)throw new Error("jQuery requires a window with a document");return t(e)}:t(e)}("undefined"!=typeof window?window:this,function(C,e){"use strict";var t=[],r=Object.getPrototypeOf,s=t.slice,g=t.flat?function(e){return t.flat.call(e)}:function(e){return t.concat.apply([],e)},u=t.push,i=t.indexOf,n={},o=n.toString,v=n.hasOwnProperty,a=v.toString,l=a.call(Object),y={},m=function(e){return"function"==typeof e&&"number"!=typeof e.nodeType&&"function"!=typeof e.item},x=function(e){return null!=e&&e===e.window},E=C.document,c={type:!0,src:!0,nonce:!0,noModule:!0};function b(e,t,n){var r,i,o=(n=n||E).createElement("script");if(o.text=e,t)for(r in c)(i=t[r]||t.getAttribute&&t.getAttribute(r))&&o.setAttribute(r,i);n.head.appendChild(o).parentNode.removeChild(o)}function w(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?n[o.call(e)]||"object":typeof e}var f="3.6.0",S=function(e,t){return new S.fn.init(e,t)};function p(e){var t=!!e&&"length"in e&&e.length,n=w(e);return!m(e)&&!x(e)&&("array"===n||0===t||"number"==typeof t&&0+~]|"+M+")"+M+"*"),U=new RegExp(M+"|>"),X=new RegExp(F),V=new RegExp("^"+I+"$"),G={ID:new RegExp("^#("+I+")"),CLASS:new RegExp("^\\.("+I+")"),TAG:new RegExp("^("+I+"|[*])"),ATTR:new RegExp("^"+W),PSEUDO:new RegExp("^"+F),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+M+"*(even|odd|(([+-]|)(\\d*)n|)"+M+"*(?:([+-]|)"+M+"*(\\d+)|))"+M+"*\\)|)","i"),bool:new RegExp("^(?:"+R+")$","i"),needsContext:new 
RegExp("^"+M+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+M+"*((?:-\\d)?\\d*)"+M+"*\\)|)(?=[^-]|$)","i")},Y=/HTML$/i,Q=/^(?:input|select|textarea|button)$/i,J=/^h\d$/i,K=/^[^{]+\{\s*\[native \w/,Z=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,ee=/[+~]/,te=new RegExp("\\\\[\\da-fA-F]{1,6}"+M+"?|\\\\([^\\r\\n\\f])","g"),ne=function(e,t){var n="0x"+e.slice(1)-65536;return t||(n<0?String.fromCharCode(n+65536):String.fromCharCode(n>>10|55296,1023&n|56320))},re=/([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g,ie=function(e,t){return t?"\0"===e?"\ufffd":e.slice(0,-1)+"\\"+e.charCodeAt(e.length-1).toString(16)+" ":"\\"+e},oe=function(){T()},ae=be(function(e){return!0===e.disabled&&"fieldset"===e.nodeName.toLowerCase()},{dir:"parentNode",next:"legend"});try{H.apply(t=O.call(p.childNodes),p.childNodes),t[p.childNodes.length].nodeType}catch(e){H={apply:t.length?function(e,t){L.apply(e,O.call(t))}:function(e,t){var n=e.length,r=0;while(e[n++]=t[r++]);e.length=n-1}}}function se(t,e,n,r){var i,o,a,s,u,l,c,f=e&&e.ownerDocument,p=e?e.nodeType:9;if(n=n||[],"string"!=typeof t||!t||1!==p&&9!==p&&11!==p)return n;if(!r&&(T(e),e=e||C,E)){if(11!==p&&(u=Z.exec(t)))if(i=u[1]){if(9===p){if(!(a=e.getElementById(i)))return n;if(a.id===i)return n.push(a),n}else if(f&&(a=f.getElementById(i))&&y(e,a)&&a.id===i)return n.push(a),n}else{if(u[2])return H.apply(n,e.getElementsByTagName(t)),n;if((i=u[3])&&d.getElementsByClassName&&e.getElementsByClassName)return H.apply(n,e.getElementsByClassName(i)),n}if(d.qsa&&!N[t+" "]&&(!v||!v.test(t))&&(1!==p||"object"!==e.nodeName.toLowerCase())){if(c=t,f=e,1===p&&(U.test(t)||z.test(t))){(f=ee.test(t)&&ye(e.parentNode)||e)===e&&d.scope||((s=e.getAttribute("id"))?s=s.replace(re,ie):e.setAttribute("id",s=S)),o=(l=h(t)).length;while(o--)l[o]=(s?"#"+s:":scope")+" "+xe(l[o]);c=l.join(",")}try{return H.apply(n,f.querySelectorAll(c)),n}catch(e){N(t,!0)}finally{s===S&&e.removeAttribute("id")}}}return g(t.replace($,"$1"),e,n,r)}function ue(){var r=[];return function e(t,n){return r.push(t+" ")>b.cacheLength&&delete e[r.shift()],e[t+" "]=n}}function le(e){return e[S]=!0,e}function ce(e){var t=C.createElement("fieldset");try{return!!e(t)}catch(e){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function fe(e,t){var n=e.split("|"),r=n.length;while(r--)b.attrHandle[n[r]]=t}function pe(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&e.sourceIndex-t.sourceIndex;if(r)return r;if(n)while(n=n.nextSibling)if(n===t)return-1;return e?1:-1}function de(t){return function(e){return"input"===e.nodeName.toLowerCase()&&e.type===t}}function he(n){return function(e){var t=e.nodeName.toLowerCase();return("input"===t||"button"===t)&&e.type===n}}function ge(t){return function(e){return"form"in e?e.parentNode&&!1===e.disabled?"label"in e?"label"in e.parentNode?e.parentNode.disabled===t:e.disabled===t:e.isDisabled===t||e.isDisabled!==!t&&ae(e)===t:e.disabled===t:"label"in e&&e.disabled===t}}function ve(a){return le(function(o){return o=+o,le(function(e,t){var n,r=a([],e.length,o),i=r.length;while(i--)e[n=r[i]]&&(e[n]=!(t[n]=e[n]))})})}function ye(e){return e&&"undefined"!=typeof e.getElementsByTagName&&e}for(e in d=se.support={},i=se.isXML=function(e){var t=e&&e.namespaceURI,n=e&&(e.ownerDocument||e).documentElement;return!Y.test(t||n&&n.nodeName||"HTML")},T=se.setDocument=function(e){var t,n,r=e?e.ownerDocument||e:p;return 
r!=C&&9===r.nodeType&&r.documentElement&&(a=(C=r).documentElement,E=!i(C),p!=C&&(n=C.defaultView)&&n.top!==n&&(n.addEventListener?n.addEventListener("unload",oe,!1):n.attachEvent&&n.attachEvent("onunload",oe)),d.scope=ce(function(e){return a.appendChild(e).appendChild(C.createElement("div")),"undefined"!=typeof e.querySelectorAll&&!e.querySelectorAll(":scope fieldset div").length}),d.attributes=ce(function(e){return e.className="i",!e.getAttribute("className")}),d.getElementsByTagName=ce(function(e){return e.appendChild(C.createComment("")),!e.getElementsByTagName("*").length}),d.getElementsByClassName=K.test(C.getElementsByClassName),d.getById=ce(function(e){return a.appendChild(e).id=S,!C.getElementsByName||!C.getElementsByName(S).length}),d.getById?(b.filter.ID=function(e){var t=e.replace(te,ne);return function(e){return e.getAttribute("id")===t}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n=t.getElementById(e);return n?[n]:[]}}):(b.filter.ID=function(e){var n=e.replace(te,ne);return function(e){var t="undefined"!=typeof e.getAttributeNode&&e.getAttributeNode("id");return t&&t.value===n}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n,r,i,o=t.getElementById(e);if(o){if((n=o.getAttributeNode("id"))&&n.value===e)return[o];i=t.getElementsByName(e),r=0;while(o=i[r++])if((n=o.getAttributeNode("id"))&&n.value===e)return[o]}return[]}}),b.find.TAG=d.getElementsByTagName?function(e,t){return"undefined"!=typeof t.getElementsByTagName?t.getElementsByTagName(e):d.qsa?t.querySelectorAll(e):void 0}:function(e,t){var n,r=[],i=0,o=t.getElementsByTagName(e);if("*"===e){while(n=o[i++])1===n.nodeType&&r.push(n);return r}return o},b.find.CLASS=d.getElementsByClassName&&function(e,t){if("undefined"!=typeof t.getElementsByClassName&&E)return t.getElementsByClassName(e)},s=[],v=[],(d.qsa=K.test(C.querySelectorAll))&&(ce(function(e){var t;a.appendChild(e).innerHTML="",e.querySelectorAll("[msallowcapture^='']").length&&v.push("[*^$]="+M+"*(?:''|\"\")"),e.querySelectorAll("[selected]").length||v.push("\\["+M+"*(?:value|"+R+")"),e.querySelectorAll("[id~="+S+"-]").length||v.push("~="),(t=C.createElement("input")).setAttribute("name",""),e.appendChild(t),e.querySelectorAll("[name='']").length||v.push("\\["+M+"*name"+M+"*="+M+"*(?:''|\"\")"),e.querySelectorAll(":checked").length||v.push(":checked"),e.querySelectorAll("a#"+S+"+*").length||v.push(".#.+[+~]"),e.querySelectorAll("\\\f"),v.push("[\\r\\n\\f]")}),ce(function(e){e.innerHTML="";var t=C.createElement("input");t.setAttribute("type","hidden"),e.appendChild(t).setAttribute("name","D"),e.querySelectorAll("[name=d]").length&&v.push("name"+M+"*[*^$|!~]?="),2!==e.querySelectorAll(":enabled").length&&v.push(":enabled",":disabled"),a.appendChild(e).disabled=!0,2!==e.querySelectorAll(":disabled").length&&v.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),v.push(",.*:")})),(d.matchesSelector=K.test(c=a.matches||a.webkitMatchesSelector||a.mozMatchesSelector||a.oMatchesSelector||a.msMatchesSelector))&&ce(function(e){d.disconnectedMatch=c.call(e,"*"),c.call(e,"[s!='']:x"),s.push("!=",F)}),v=v.length&&new RegExp(v.join("|")),s=s.length&&new RegExp(s.join("|")),t=K.test(a.compareDocumentPosition),y=t||K.test(a.contains)?function(e,t){var n=9===e.nodeType?e.documentElement:e,r=t&&t.parentNode;return 
e===r||!(!r||1!==r.nodeType||!(n.contains?n.contains(r):e.compareDocumentPosition&&16&e.compareDocumentPosition(r)))}:function(e,t){if(t)while(t=t.parentNode)if(t===e)return!0;return!1},j=t?function(e,t){if(e===t)return l=!0,0;var n=!e.compareDocumentPosition-!t.compareDocumentPosition;return n||(1&(n=(e.ownerDocument||e)==(t.ownerDocument||t)?e.compareDocumentPosition(t):1)||!d.sortDetached&&t.compareDocumentPosition(e)===n?e==C||e.ownerDocument==p&&y(p,e)?-1:t==C||t.ownerDocument==p&&y(p,t)?1:u?P(u,e)-P(u,t):0:4&n?-1:1)}:function(e,t){if(e===t)return l=!0,0;var n,r=0,i=e.parentNode,o=t.parentNode,a=[e],s=[t];if(!i||!o)return e==C?-1:t==C?1:i?-1:o?1:u?P(u,e)-P(u,t):0;if(i===o)return pe(e,t);n=e;while(n=n.parentNode)a.unshift(n);n=t;while(n=n.parentNode)s.unshift(n);while(a[r]===s[r])r++;return r?pe(a[r],s[r]):a[r]==p?-1:s[r]==p?1:0}),C},se.matches=function(e,t){return se(e,null,null,t)},se.matchesSelector=function(e,t){if(T(e),d.matchesSelector&&E&&!N[t+" "]&&(!s||!s.test(t))&&(!v||!v.test(t)))try{var n=c.call(e,t);if(n||d.disconnectedMatch||e.document&&11!==e.document.nodeType)return n}catch(e){N(t,!0)}return 0":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(te,ne),e[3]=(e[3]||e[4]||e[5]||"").replace(te,ne),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&X.test(n)&&(t=h(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(te,ne).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=m[e+" "];return t||(t=new RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&m(e,function(e){return t.test("string"==typeof e.className&&e.className||"undefined"!=typeof e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(n,r,i){return function(e){var t=se.attr(e,n);return null==t?"!="===r:!r||(t+="","="===r?t===i:"!="===r?t!==i:"^="===r?i&&0===t.indexOf(i):"*="===r?i&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return m(n)?S.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?S.grep(e,function(e){return e===n!==r}):"string"!=typeof n?S.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(S.fn.init=function(e,t,n){var r,i;if(!e)return this;if(n=n||D,"string"==typeof e){if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:q.exec(e))||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof S?t[0]:t,S.merge(this,S.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:E,!0)),N.test(r[1])&&S.isPlainObject(t))for(r in t)m(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(i=E.getElementById(r[2]))&&(this[0]=i,this.length=1),this}return e.nodeType?(this[0]=e,this.length=1,this):m(e)?void 0!==n.ready?n.ready(e):e(S):S.makeArray(e,this)}).prototype=S.fn,D=S(E);var L=/^(?:parents|prev(?:Until|All))/,H={children:!0,contents:!0,next:!0,prev:!0};function O(e,t){while((e=e[t])&&1!==e.nodeType);return e}S.fn.extend({has:function(e){var t=S(e,this),n=t.length;return this.filter(function(){for(var 
e=0;e\x20\t\r\n\f]*)/i,he=/^$|^module$|\/(?:java|ecma)script/i;ce=E.createDocumentFragment().appendChild(E.createElement("div")),(fe=E.createElement("input")).setAttribute("type","radio"),fe.setAttribute("checked","checked"),fe.setAttribute("name","t"),ce.appendChild(fe),y.checkClone=ce.cloneNode(!0).cloneNode(!0).lastChild.checked,ce.innerHTML="",y.noCloneChecked=!!ce.cloneNode(!0).lastChild.defaultValue,ce.innerHTML="",y.option=!!ce.lastChild;var ge={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function ve(e,t){var n;return n="undefined"!=typeof e.getElementsByTagName?e.getElementsByTagName(t||"*"):"undefined"!=typeof e.querySelectorAll?e.querySelectorAll(t||"*"):[],void 0===t||t&&A(e,t)?S.merge([e],n):n}function ye(e,t){for(var n=0,r=e.length;n",""]);var me=/<|&#?\w+;/;function xe(e,t,n,r,i){for(var o,a,s,u,l,c,f=t.createDocumentFragment(),p=[],d=0,h=e.length;d\s*$/g;function je(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&S(e).children("tbody")[0]||e}function De(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function qe(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Le(e,t){var n,r,i,o,a,s;if(1===t.nodeType){if(Y.hasData(e)&&(s=Y.get(e).events))for(i in Y.remove(t,"handle events"),s)for(n=0,r=s[i].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",i=function(e){r.remove(),i=null,e&&t("error"===e.type?404:200,e.type)}),E.head.appendChild(r[0])},abort:function(){i&&i()}}});var _t,zt=[],Ut=/(=)\?(?=&|$)|\?\?/;S.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=zt.pop()||S.expando+"_"+wt.guid++;return this[e]=!0,e}}),S.ajaxPrefilter("json jsonp",function(e,t,n){var r,i,o,a=!1!==e.jsonp&&(Ut.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Ut.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=m(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Ut,"$1"+r):!1!==e.jsonp&&(e.url+=(Tt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return o||S.error(r+" was not called"),o[0]},e.dataTypes[0]="json",i=C[r],C[r]=function(){o=arguments},n.always(function(){void 0===i?S(C).removeProp(r):C[r]=i,e[r]&&(e.jsonpCallback=t.jsonpCallback,zt.push(r)),o&&m(i)&&i(o[0]),o=i=void 0}),"script"}),y.createHTMLDocument=((_t=E.implementation.createHTMLDocument("").body).innerHTML="
",2===_t.childNodes.length),S.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=E.implementation.createHTMLDocument("")).createElement("base")).href=E.location.href,t.head.appendChild(r)):t=E),o=!n&&[],(i=N.exec(e))?[t.createElement(i[1])]:(i=xe([e],t,o),o&&o.length&&S(o).remove(),S.merge([],i.childNodes)));var r,i,o},S.fn.load=function(e,t,n){var r,i,o,a=this,s=e.indexOf(" ");return-1").append(S.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,o||[e.responseText,t,e])})}),this},S.expr.pseudos.animated=function(t){return S.grep(S.timers,function(e){return t===e.elem}).length},S.offset={setOffset:function(e,t,n){var r,i,o,a,s,u,l=S.css(e,"position"),c=S(e),f={};"static"===l&&(e.style.position="relative"),s=c.offset(),o=S.css(e,"top"),u=S.css(e,"left"),("absolute"===l||"fixed"===l)&&-1<(o+u).indexOf("auto")?(a=(r=c.position()).top,i=r.left):(a=parseFloat(o)||0,i=parseFloat(u)||0),m(t)&&(t=t.call(e,n,S.extend({},s))),null!=t.top&&(f.top=t.top-s.top+a),null!=t.left&&(f.left=t.left-s.left+i),"using"in t?t.using.call(e,f):c.css(f)}},S.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){S.offset.setOffset(this,t,e)});var e,n,r=this[0];return r?r.getClientRects().length?(e=r.getBoundingClientRect(),n=r.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var e,t,n,r=this[0],i={top:0,left:0};if("fixed"===S.css(r,"position"))t=r.getBoundingClientRect();else{t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;while(e&&(e===n.body||e===n.documentElement)&&"static"===S.css(e,"position"))e=e.parentNode;e&&e!==r&&1===e.nodeType&&((i=S(e).offset()).top+=S.css(e,"borderTopWidth",!0),i.left+=S.css(e,"borderLeftWidth",!0))}return{top:t.top-i.top-S.css(r,"marginTop",!0),left:t.left-i.left-S.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent;while(e&&"static"===S.css(e,"position"))e=e.offsetParent;return e||re})}}),S.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,i){var o="pageYOffset"===i;S.fn[t]=function(e){return $(this,function(e,t,n){var r;if(x(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n)return r?r[i]:e[t];r?r.scrollTo(o?r.pageXOffset:n,o?n:r.pageYOffset):e[t]=n},t,e,arguments.length)}}),S.each(["top","left"],function(e,n){S.cssHooks[n]=Fe(y.pixelPosition,function(e,t){if(t)return t=We(e,n),Pe.test(t)?S(e).position()[n]+"px":t})}),S.each({Height:"height",Width:"width"},function(a,s){S.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,o){S.fn[o]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),i=r||(!0===e||!0===t?"margin":"border");return $(this,function(e,t,n){var r;return x(e)?0===o.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?S.css(e,t,i):S.style(e,t,n,i)},s,n?e:void 0,n)}})}),S.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){S.fn[t]=function(e){return this.on(t,e)}}),S.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return 
this.mouseenter(e).mouseleave(t||e)}}),S.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){S.fn[n]=function(e,t){return 0",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=y.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=y.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),y.elements=c+" "+a,j(b)}function f(a){var b=x[a[v]];return b||(b={},w++,a[v]=w,x[w]=b),b}function g(a,c,d){if(c||(c=b),q)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():u.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||t.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),q)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return y.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(y,b.frag)}function j(a){a||(a=b);var d=f(a);return!y.shivCSS||p||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),q||i(a,d),a}function k(a){for(var b,c=a.getElementsByTagName("*"),e=c.length,f=RegExp("^(?:"+d().join("|")+")$","i"),g=[];e--;)b=c[e],f.test(b.nodeName)&&g.push(b.applyElement(l(b)));return g}function l(a){for(var b,c=a.attributes,d=c.length,e=a.ownerDocument.createElement(A+":"+a.nodeName);d--;)b=c[d],b.specified&&e.setAttribute(b.nodeName,b.nodeValue);return e.style.cssText=a.style.cssText,e}function m(a){for(var b,c=a.split("{"),e=c.length,f=RegExp("(^|[\\s,>+~])("+d().join("|")+")(?=[[\\s,>+~#.:]|$)","gi"),g="$1"+A+"\\:$2";e--;)b=c[e]=c[e].split("}"),b[b.length-1]=b[b.length-1].replace(f,g),c[e]=b.join("}");return c.join("{")}function n(a){for(var b=a.length;b--;)a[b].removeNode()}function o(a){function b(){clearTimeout(g._removeSheetTimer),d&&d.removeNode(!0),d=null}var d,e,g=f(a),h=a.namespaces,i=a.parentWindow;return!B||a.printShived?a:("undefined"==typeof h[A]&&h.add(A),i.attachEvent("onbeforeprint",function(){b();for(var f,g,h,i=a.styleSheets,j=[],l=i.length,n=Array(l);l--;)n[l]=i[l];for(;h=n.pop();)if(!h.disabled&&z.test(h.media)){try{f=h.imports,g=f.length}catch(o){g=0}for(l=0;g>l;l++)n.push(f[l]);try{j.push(h.cssText)}catch(o){}}j=m(j.reverse().join("")),e=k(a),d=c(a,j)}),i.attachEvent("onafterprint",function(){n(e),clearTimeout(g._removeSheetTimer),g._removeSheetTimer=setTimeout(b,500)}),a.printShived=!0,a)}var p,q,r="3.7.3",s=a.html5||{},t=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,u=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,v="_html5shiv",w=0,x={};!function(){try{var a=b.createElement("a");a.innerHTML="",p="hidden"in a,q=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof 
a.createElement}()}catch(c){p=!0,q=!0}}();var y={elements:s.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:r,shivCSS:s.shivCSS!==!1,supportsUnknownElements:q,shivMethods:s.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=y,j(b);var z=/^$|\b(?:all|print)\b/,A="html5shiv",B=!q&&function(){var c=b.documentElement;return!("undefined"==typeof b.namespaces||"undefined"==typeof b.parentWindow||"undefined"==typeof c.applyElement||"undefined"==typeof c.removeNode||"undefined"==typeof a.attachEvent)}();y.type+=" print",y.shivPrint=o,o(b),"object"==typeof module&&module.exports&&(module.exports=y)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/remove_assignable/_static/js/html5shiv.min.js b/branch/remove_assignable/_static/js/html5shiv.min.js new file mode 100644 index 0000000..cd1c674 --- /dev/null +++ b/branch/remove_assignable/_static/js/html5shiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3 | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=t.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=t.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),t.elements=c+" "+a,j(b)}function f(a){var b=s[a[q]];return b||(b={},r++,a[q]=r,s[r]=b),b}function g(a,c,d){if(c||(c=b),l)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():p.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||o.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),l)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return t.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(t,b.frag)}function j(a){a||(a=b);var d=f(a);return!t.shivCSS||k||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),l||i(a,d),a}var k,l,m="3.7.3-pre",n=a.html5||{},o=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,p=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,q="_html5shiv",r=0,s={};!function(){try{var a=b.createElement("a");a.innerHTML="",k="hidden"in a,l=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){k=!0,l=!0}}();var t={elements:n.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time 
video",version:m,shivCSS:n.shivCSS!==!1,supportsUnknownElements:l,shivMethods:n.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=t,j(b),"object"==typeof module&&module.exports&&(module.exports=t)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/remove_assignable/_static/js/theme.js b/branch/remove_assignable/_static/js/theme.js new file mode 100644 index 0000000..1fddb6e --- /dev/null +++ b/branch/remove_assignable/_static/js/theme.js @@ -0,0 +1 @@ +!function(n){var e={};function t(i){if(e[i])return e[i].exports;var o=e[i]={i:i,l:!1,exports:{}};return n[i].call(o.exports,o,o.exports,t),o.l=!0,o.exports}t.m=n,t.c=e,t.d=function(n,e,i){t.o(n,e)||Object.defineProperty(n,e,{enumerable:!0,get:i})},t.r=function(n){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(n,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(n,"__esModule",{value:!0})},t.t=function(n,e){if(1&e&&(n=t(n)),8&e)return n;if(4&e&&"object"==typeof n&&n&&n.__esModule)return n;var i=Object.create(null);if(t.r(i),Object.defineProperty(i,"default",{enumerable:!0,value:n}),2&e&&"string"!=typeof n)for(var o in n)t.d(i,o,function(e){return n[e]}.bind(null,o));return i},t.n=function(n){var e=n&&n.__esModule?function(){return n.default}:function(){return n};return t.d(e,"a",e),e},t.o=function(n,e){return Object.prototype.hasOwnProperty.call(n,e)},t.p="",t(t.s=0)}([function(n,e,t){t(1),n.exports=t(3)},function(n,e,t){(function(){var e="undefined"!=typeof window?window.jQuery:t(2);n.exports.ThemeNav={navBar:null,win:null,winScroll:!1,winResize:!1,linkScroll:!1,winPosition:0,winHeight:null,docHeight:null,isRunning:!1,enable:function(n){var t=this;void 0===n&&(n=!0),t.isRunning||(t.isRunning=!0,e((function(e){t.init(e),t.reset(),t.win.on("hashchange",t.reset),n&&t.win.on("scroll",(function(){t.linkScroll||t.winScroll||(t.winScroll=!0,requestAnimationFrame((function(){t.onScroll()})))})),t.win.on("resize",(function(){t.winResize||(t.winResize=!0,requestAnimationFrame((function(){t.onResize()})))})),t.onResize()})))},enableSticky:function(){this.enable(!0)},init:function(n){n(document);var e=this;this.navBar=n("div.wy-side-scroll:first"),this.win=n(window),n(document).on("click","[data-toggle='wy-nav-top']",(function(){n("[data-toggle='wy-nav-shift']").toggleClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift")})).on("click",".wy-menu-vertical .current ul li a",(function(){var t=n(this);n("[data-toggle='wy-nav-shift']").removeClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift"),e.toggleCurrent(t),e.hashChange()})).on("click","[data-toggle='rst-current-version']",(function(){n("[data-toggle='rst-versions']").toggleClass("shift-up")})),n("table.docutils:not(.field-list,.footnote,.citation)").wrap("
"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t0 + var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 + var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 + var s_v = "^(" + C + ")?" 
+ v; // vowel in stem + + this.stemWord = function (w) { + var stem; + var suffix; + var firstch; + var origword = w; + + if (w.length < 3) + return w; + + var re; + var re2; + var re3; + var re4; + + firstch = w.substr(0,1); + if (firstch == "y") + w = firstch.toUpperCase() + w.substr(1); + + // Step 1a + re = /^(.+?)(ss|i)es$/; + re2 = /^(.+?)([^s])s$/; + + if (re.test(w)) + w = w.replace(re,"$1$2"); + else if (re2.test(w)) + w = w.replace(re2,"$1$2"); + + // Step 1b + re = /^(.+?)eed$/; + re2 = /^(.+?)(ed|ing)$/; + if (re.test(w)) { + var fp = re.exec(w); + re = new RegExp(mgr0); + if (re.test(fp[1])) { + re = /.$/; + w = w.replace(re,""); + } + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = new RegExp(s_v); + if (re2.test(stem)) { + w = stem; + re2 = /(at|bl|iz)$/; + re3 = new RegExp("([^aeiouylsz])\\1$"); + re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re2.test(w)) + w = w + "e"; + else if (re3.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + else if (re4.test(w)) + w = w + "e"; + } + } + + // Step 1c + re = /^(.+?)y$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(s_v); + if (re.test(stem)) + w = stem + "i"; + } + + // Step 2 + re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step2list[suffix]; + } + + // Step 3 + re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step3list[suffix]; + } + + // Step 4 + re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + re2 = /^(.+?)(s|t)(ion)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + if (re.test(stem)) + w = stem; + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = new RegExp(mgr1); + if (re2.test(stem)) + w = stem; + } + + // Step 5 + re = /^(.+?)e$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + re2 = new RegExp(meq1); + re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) + w = stem; + } + re = /ll$/; + re2 = new RegExp(mgr1); + if (re.test(w) && re2.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + + // and turn initial Y back to y + if (firstch == "y") + w = firstch.toLowerCase() + w.substr(1); + return w; + } +} + diff --git a/branch/remove_assignable/_static/minus.png b/branch/remove_assignable/_static/minus.png new file mode 100644 index 0000000..d96755f Binary files /dev/null and b/branch/remove_assignable/_static/minus.png differ diff --git a/branch/remove_assignable/_static/plus.png b/branch/remove_assignable/_static/plus.png new file mode 100644 index 0000000..7107cec Binary files /dev/null and b/branch/remove_assignable/_static/plus.png differ diff --git a/branch/remove_assignable/_static/pygments.css b/branch/remove_assignable/_static/pygments.css new file mode 100644 index 0000000..84ab303 --- /dev/null +++ b/branch/remove_assignable/_static/pygments.css @@ -0,0 +1,75 @@ +pre { line-height: 125%; } +td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +span.linenos { color: inherit; background-color: transparent; 
padding-left: 5px; padding-right: 5px; } +td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +.highlight .hll { background-color: #ffffcc } +.highlight { background: #f8f8f8; } +.highlight .c { color: #3D7B7B; font-style: italic } /* Comment */ +.highlight .err { border: 1px solid #FF0000 } /* Error */ +.highlight .k { color: #008000; font-weight: bold } /* Keyword */ +.highlight .o { color: #666666 } /* Operator */ +.highlight .ch { color: #3D7B7B; font-style: italic } /* Comment.Hashbang */ +.highlight .cm { color: #3D7B7B; font-style: italic } /* Comment.Multiline */ +.highlight .cp { color: #9C6500 } /* Comment.Preproc */ +.highlight .cpf { color: #3D7B7B; font-style: italic } /* Comment.PreprocFile */ +.highlight .c1 { color: #3D7B7B; font-style: italic } /* Comment.Single */ +.highlight .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */ +.highlight .gd { color: #A00000 } /* Generic.Deleted */ +.highlight .ge { font-style: italic } /* Generic.Emph */ +.highlight .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */ +.highlight .gr { color: #E40000 } /* Generic.Error */ +.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ +.highlight .gi { color: #008400 } /* Generic.Inserted */ +.highlight .go { color: #717171 } /* Generic.Output */ +.highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */ +.highlight .gs { font-weight: bold } /* Generic.Strong */ +.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ +.highlight .gt { color: #0044DD } /* Generic.Traceback */ +.highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */ +.highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */ +.highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */ +.highlight .kp { color: #008000 } /* Keyword.Pseudo */ +.highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */ +.highlight .kt { color: #B00040 } /* Keyword.Type */ +.highlight .m { color: #666666 } /* Literal.Number */ +.highlight .s { color: #BA2121 } /* Literal.String */ +.highlight .na { color: #687822 } /* Name.Attribute */ +.highlight .nb { color: #008000 } /* Name.Builtin */ +.highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */ +.highlight .no { color: #880000 } /* Name.Constant */ +.highlight .nd { color: #AA22FF } /* Name.Decorator */ +.highlight .ni { color: #717171; font-weight: bold } /* Name.Entity */ +.highlight .ne { color: #CB3F38; font-weight: bold } /* Name.Exception */ +.highlight .nf { color: #0000FF } /* Name.Function */ +.highlight .nl { color: #767600 } /* Name.Label */ +.highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */ +.highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */ +.highlight .nv { color: #19177C } /* Name.Variable */ +.highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */ +.highlight .w { color: #bbbbbb } /* Text.Whitespace */ +.highlight .mb { color: #666666 } /* Literal.Number.Bin */ +.highlight .mf { color: #666666 } /* Literal.Number.Float */ +.highlight .mh { color: #666666 } /* Literal.Number.Hex */ +.highlight .mi { color: #666666 } /* Literal.Number.Integer */ +.highlight .mo { color: #666666 } /* Literal.Number.Oct */ +.highlight .sa { color: #BA2121 } /* Literal.String.Affix */ +.highlight .sb { color: 
#BA2121 } /* Literal.String.Backtick */ +.highlight .sc { color: #BA2121 } /* Literal.String.Char */ +.highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */ +.highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */ +.highlight .s2 { color: #BA2121 } /* Literal.String.Double */ +.highlight .se { color: #AA5D1F; font-weight: bold } /* Literal.String.Escape */ +.highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */ +.highlight .si { color: #A45A77; font-weight: bold } /* Literal.String.Interpol */ +.highlight .sx { color: #008000 } /* Literal.String.Other */ +.highlight .sr { color: #A45A77 } /* Literal.String.Regex */ +.highlight .s1 { color: #BA2121 } /* Literal.String.Single */ +.highlight .ss { color: #19177C } /* Literal.String.Symbol */ +.highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */ +.highlight .fm { color: #0000FF } /* Name.Function.Magic */ +.highlight .vc { color: #19177C } /* Name.Variable.Class */ +.highlight .vg { color: #19177C } /* Name.Variable.Global */ +.highlight .vi { color: #19177C } /* Name.Variable.Instance */ +.highlight .vm { color: #19177C } /* Name.Variable.Magic */ +.highlight .il { color: #666666 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/branch/remove_assignable/_static/searchtools.js b/branch/remove_assignable/_static/searchtools.js new file mode 100644 index 0000000..97d56a7 --- /dev/null +++ b/branch/remove_assignable/_static/searchtools.js @@ -0,0 +1,566 @@ +/* + * searchtools.js + * ~~~~~~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for the full-text search. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +/** + * Simple result scoring code. + */ +if (typeof Scorer === "undefined") { + var Scorer = { + // Implement the following function to further tweak the score for each result + // The function takes a result array [docname, title, anchor, descr, score, filename] + // and returns the new score. + /* + score: result => { + const [docname, title, anchor, descr, score, filename] = result + return score + }, + */ + + // query matches the full name of an object + objNameMatch: 11, + // or matches in the last dotted part of the object name + objPartialMatch: 6, + // Additive scores depending on the priority of the object + objPrio: { + 0: 15, // used to be importantResults + 1: 5, // used to be objectResults + 2: -5, // used to be unimportantResults + }, + // Used when the priority is not in the mapping. 
+ objPrioDefault: 0, + + // query found in title + title: 15, + partialTitle: 7, + // query found in terms + term: 5, + partialTerm: 2, + }; +} + +const _removeChildren = (element) => { + while (element && element.lastChild) element.removeChild(element.lastChild); +}; + +/** + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping + */ +const _escapeRegExp = (string) => + string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string + +const _displayItem = (item, searchTerms) => { + const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; + const docUrlRoot = DOCUMENTATION_OPTIONS.URL_ROOT; + const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; + const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; + const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; + + const [docName, title, anchor, descr, score, _filename] = item; + + let listItem = document.createElement("li"); + let requestUrl; + let linkUrl; + if (docBuilder === "dirhtml") { + // dirhtml builder + let dirname = docName + "/"; + if (dirname.match(/\/index\/$/)) + dirname = dirname.substring(0, dirname.length - 6); + else if (dirname === "index/") dirname = ""; + requestUrl = docUrlRoot + dirname; + linkUrl = requestUrl; + } else { + // normal html builders + requestUrl = docUrlRoot + docName + docFileSuffix; + linkUrl = docName + docLinkSuffix; + } + let linkEl = listItem.appendChild(document.createElement("a")); + linkEl.href = linkUrl + anchor; + linkEl.dataset.score = score; + linkEl.innerHTML = title; + if (descr) + listItem.appendChild(document.createElement("span")).innerHTML = + " (" + descr + ")"; + else if (showSearchSummary) + fetch(requestUrl) + .then((responseData) => responseData.text()) + .then((data) => { + if (data) + listItem.appendChild( + Search.makeSearchSummary(data, searchTerms) + ); + }); + Search.output.appendChild(listItem); +}; +const _finishSearch = (resultCount) => { + Search.stopPulse(); + Search.title.innerText = _("Search Results"); + if (!resultCount) + Search.status.innerText = Documentation.gettext( + "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." + ); + else + Search.status.innerText = _( + `Search finished, found ${resultCount} page(s) matching the search query.` + ); +}; +const _displayNextItem = ( + results, + resultCount, + searchTerms +) => { + // results left, load the summary and display it + // this is intended to be dynamic (don't sub resultsCount) + if (results.length) { + _displayItem(results.pop(), searchTerms); + setTimeout( + () => _displayNextItem(results, resultCount, searchTerms), + 5 + ); + } + // search finished, update title and status message + else _finishSearch(resultCount); +}; + +/** + * Default splitQuery function. Can be overridden in ``sphinx.search`` with a + * custom function per language. + * + * The regular expression works by splitting the string on consecutive characters + * that are not Unicode letters, numbers, underscores, or emoji characters. + * This is the same as ``\W+`` in Python, preserving the surrogate pair area. 
+ */ +if (typeof splitQuery === "undefined") { + var splitQuery = (query) => query + .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) + .filter(term => term) // remove remaining empty strings +} + +/** + * Search Module + */ +const Search = { + _index: null, + _queued_query: null, + _pulse_status: -1, + + htmlToText: (htmlString) => { + const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); + htmlElement.querySelectorAll(".headerlink").forEach((el) => { el.remove() }); + const docContent = htmlElement.querySelector('[role="main"]'); + if (docContent !== undefined) return docContent.textContent; + console.warn( + "Content block not found. Sphinx search tries to obtain it via '[role=main]'. Could you check your theme or template." + ); + return ""; + }, + + init: () => { + const query = new URLSearchParams(window.location.search).get("q"); + document + .querySelectorAll('input[name="q"]') + .forEach((el) => (el.value = query)); + if (query) Search.performSearch(query); + }, + + loadIndex: (url) => + (document.body.appendChild(document.createElement("script")).src = url), + + setIndex: (index) => { + Search._index = index; + if (Search._queued_query !== null) { + const query = Search._queued_query; + Search._queued_query = null; + Search.query(query); + } + }, + + hasIndex: () => Search._index !== null, + + deferQuery: (query) => (Search._queued_query = query), + + stopPulse: () => (Search._pulse_status = -1), + + startPulse: () => { + if (Search._pulse_status >= 0) return; + + const pulse = () => { + Search._pulse_status = (Search._pulse_status + 1) % 4; + Search.dots.innerText = ".".repeat(Search._pulse_status); + if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); + }; + pulse(); + }, + + /** + * perform a search for something (or wait until index is loaded) + */ + performSearch: (query) => { + // create the required interface elements + const searchText = document.createElement("h2"); + searchText.textContent = _("Searching"); + const searchSummary = document.createElement("p"); + searchSummary.classList.add("search-summary"); + searchSummary.innerText = ""; + const searchList = document.createElement("ul"); + searchList.classList.add("search"); + + const out = document.getElementById("search-results"); + Search.title = out.appendChild(searchText); + Search.dots = Search.title.appendChild(document.createElement("span")); + Search.status = out.appendChild(searchSummary); + Search.output = out.appendChild(searchList); + + const searchProgress = document.getElementById("search-progress"); + // Some themes don't use the search progress node + if (searchProgress) { + searchProgress.innerText = _("Preparing search..."); + } + Search.startPulse(); + + // index already loaded, the browser was quick! 
+ if (Search.hasIndex()) Search.query(query); + else Search.deferQuery(query); + }, + + /** + * execute search (requires search index to be loaded) + */ + query: (query) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + const allTitles = Search._index.alltitles; + const indexEntries = Search._index.indexentries; + + // stem the search terms and add them to the correct list + const stemmer = new Stemmer(); + const searchTerms = new Set(); + const excludedTerms = new Set(); + const highlightTerms = new Set(); + const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); + splitQuery(query.trim()).forEach((queryTerm) => { + const queryTermLower = queryTerm.toLowerCase(); + + // maybe skip this "word" + // stopwords array is from language_data.js + if ( + stopwords.indexOf(queryTermLower) !== -1 || + queryTerm.match(/^\d+$/) + ) + return; + + // stem the word + let word = stemmer.stemWord(queryTermLower); + // select the correct list + if (word[0] === "-") excludedTerms.add(word.substr(1)); + else { + searchTerms.add(word); + highlightTerms.add(queryTermLower); + } + }); + + if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js + localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) + } + + // console.debug("SEARCH: searching for:"); + // console.info("required: ", [...searchTerms]); + // console.info("excluded: ", [...excludedTerms]); + + // array of [docname, title, anchor, descr, score, filename] + let results = []; + _removeChildren(document.getElementById("search-progress")); + + const queryLower = query.toLowerCase(); + for (const [title, foundTitles] of Object.entries(allTitles)) { + if (title.toLowerCase().includes(queryLower) && (queryLower.length >= title.length/2)) { + for (const [file, id] of foundTitles) { + let score = Math.round(100 * queryLower.length / title.length) + results.push([ + docNames[file], + titles[file] !== title ? `${titles[file]} > ${title}` : title, + id !== null ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // search for explicit entries in index directives + for (const [entry, foundEntries] of Object.entries(indexEntries)) { + if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { + for (const [file, id] of foundEntries) { + let score = Math.round(100 * queryLower.length / entry.length) + results.push([ + docNames[file], + titles[file], + id ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // lookup as object + objectTerms.forEach((term) => + results.push(...Search.performObjectSearch(term, objectTerms)) + ); + + // lookup as search terms in fulltext + results.push(...Search.performTermsSearch(searchTerms, excludedTerms)); + + // let the scorer override scores with a custom scoring function + if (Scorer.score) results.forEach((item) => (item[4] = Scorer.score(item))); + + // now sort the results by score (in opposite order of appearance, since the + // display function below uses pop() to retrieve items) and then + // alphabetically + results.sort((a, b) => { + const leftScore = a[4]; + const rightScore = b[4]; + if (leftScore === rightScore) { + // same score: sort alphabetically + const leftTitle = a[1].toLowerCase(); + const rightTitle = b[1].toLowerCase(); + if (leftTitle === rightTitle) return 0; + return leftTitle > rightTitle ? -1 : 1; // inverted is intentional + } + return leftScore > rightScore ? 
1 : -1; + }); + + // remove duplicate search results + // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept + let seen = new Set(); + results = results.reverse().reduce((acc, result) => { + let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); + if (!seen.has(resultStr)) { + acc.push(result); + seen.add(resultStr); + } + return acc; + }, []); + + results = results.reverse(); + + // for debugging + //Search.lastresults = results.slice(); // a copy + // console.info("search results:", Search.lastresults); + + // print the results + _displayNextItem(results, results.length, searchTerms); + }, + + /** + * search for object names + */ + performObjectSearch: (object, objectTerms) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const objects = Search._index.objects; + const objNames = Search._index.objnames; + const titles = Search._index.titles; + + const results = []; + + const objectSearchCallback = (prefix, match) => { + const name = match[4] + const fullname = (prefix ? prefix + "." : "") + name; + const fullnameLower = fullname.toLowerCase(); + if (fullnameLower.indexOf(object) < 0) return; + + let score = 0; + const parts = fullnameLower.split("."); + + // check for different match types: exact matches of full name or + // "last name" (i.e. last dotted part) + if (fullnameLower === object || parts.slice(-1)[0] === object) + score += Scorer.objNameMatch; + else if (parts.slice(-1)[0].indexOf(object) > -1) + score += Scorer.objPartialMatch; // matches in last name + + const objName = objNames[match[1]][2]; + const title = titles[match[0]]; + + // If more than one term searched for, we require other words to be + // found in the name/title/description + const otherTerms = new Set(objectTerms); + otherTerms.delete(object); + if (otherTerms.size > 0) { + const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); + if ( + [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) + ) + return; + } + + let anchor = match[3]; + if (anchor === "") anchor = fullname; + else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; + + const descr = objName + _(", in ") + title; + + // add custom score for some objects according to scorer + if (Scorer.objPrio.hasOwnProperty(match[2])) + score += Scorer.objPrio[match[2]]; + else score += Scorer.objPrioDefault; + + results.push([ + docNames[match[0]], + fullname, + "#" + anchor, + descr, + score, + filenames[match[0]], + ]); + }; + Object.keys(objects).forEach((prefix) => + objects[prefix].forEach((array) => + objectSearchCallback(prefix, array) + ) + ); + return results; + }, + + /** + * search for full-text terms in the index + */ + performTermsSearch: (searchTerms, excludedTerms) => { + // prepare search + const terms = Search._index.terms; + const titleTerms = Search._index.titleterms; + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + + const scoreMap = new Map(); + const fileMap = new Map(); + + // perform the search on the required terms + searchTerms.forEach((word) => { + const files = []; + const arr = [ + { files: terms[word], score: Scorer.term }, + { files: titleTerms[word], score: Scorer.title }, + ]; + // add support for partial matches + if (word.length > 2) { + const escapedWord = _escapeRegExp(word); + Object.keys(terms).forEach((term) => { + if (term.match(escapedWord) && !terms[word]) + arr.push({ 
files: terms[term], score: Scorer.partialTerm }); + }); + Object.keys(titleTerms).forEach((term) => { + if (term.match(escapedWord) && !titleTerms[word]) + arr.push({ files: titleTerms[word], score: Scorer.partialTitle }); + }); + } + + // no match but word was a required one + if (arr.every((record) => record.files === undefined)) return; + + // found search word in contents + arr.forEach((record) => { + if (record.files === undefined) return; + + let recordFiles = record.files; + if (recordFiles.length === undefined) recordFiles = [recordFiles]; + files.push(...recordFiles); + + // set score for the word in each file + recordFiles.forEach((file) => { + if (!scoreMap.has(file)) scoreMap.set(file, {}); + scoreMap.get(file)[word] = record.score; + }); + }); + + // create the mapping + files.forEach((file) => { + if (fileMap.has(file) && fileMap.get(file).indexOf(word) === -1) + fileMap.get(file).push(word); + else fileMap.set(file, [word]); + }); + }); + + // now check if the files don't contain excluded terms + const results = []; + for (const [file, wordList] of fileMap) { + // check if all requirements are matched + + // as search terms with length < 3 are discarded + const filteredTermCount = [...searchTerms].filter( + (term) => term.length > 2 + ).length; + if ( + wordList.length !== searchTerms.size && + wordList.length !== filteredTermCount + ) + continue; + + // ensure that none of the excluded terms is in the search result + if ( + [...excludedTerms].some( + (term) => + terms[term] === file || + titleTerms[term] === file || + (terms[term] || []).includes(file) || + (titleTerms[term] || []).includes(file) + ) + ) + break; + + // select one (max) score for the file. + const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); + // add result to the result list + results.push([ + docNames[file], + titles[file], + "", + null, + score, + filenames[file], + ]); + } + return results; + }, + + /** + * helper function to return a node containing the + * search summary for a given text. keywords is a list + * of stemmed words. + */ + makeSearchSummary: (htmlText, keywords) => { + const text = Search.htmlToText(htmlText); + if (text === "") return null; + + const textLower = text.toLowerCase(); + const actualStartPosition = [...keywords] + .map((k) => textLower.indexOf(k.toLowerCase())) + .filter((i) => i > -1) + .slice(-1)[0]; + const startWithContext = Math.max(actualStartPosition - 120, 0); + + const top = startWithContext === 0 ? "" : "..."; + const tail = startWithContext + 240 < text.length ? "..." : ""; + + let summary = document.createElement("p"); + summary.classList.add("context"); + summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; + + return summary; + }, +}; + +_ready(Search.init); diff --git a/branch/remove_assignable/_static/sphinx_highlight.js b/branch/remove_assignable/_static/sphinx_highlight.js new file mode 100644 index 0000000..aae669d --- /dev/null +++ b/branch/remove_assignable/_static/sphinx_highlight.js @@ -0,0 +1,144 @@ +/* Highlighting utilities for Sphinx HTML documentation. */ +"use strict"; + +const SPHINX_HIGHLIGHT_ENABLED = true + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. 
+ */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + parent.insertBefore( + span, + parent.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const SphinxHighlight = { + + /** + * highlight the search words provided in localstorage in the text + */ + highlightSearchWords: () => { + if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight + + // get and clear terms from localstorage + const url = new URL(window.location); + const highlight = + localStorage.getItem("sphinx_highlight_terms") + || url.searchParams.get("highlight") + || ""; + localStorage.removeItem("sphinx_highlight_terms") + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + + // get individual terms from highlight string + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? 
divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + localStorage.removeItem("sphinx_highlight_terms") + }, + + initEscapeListener: () => { + // only install a listener if it is really needed + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; + if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { + SphinxHighlight.hideSearchWords(); + event.preventDefault(); + } + }); + }, +}; + +_ready(SphinxHighlight.highlightSearchWords); +_ready(SphinxHighlight.initEscapeListener); diff --git a/branch/remove_assignable/autodoc/index.html b/branch/remove_assignable/autodoc/index.html new file mode 100644 index 0000000..4179b9e --- /dev/null +++ b/branch/remove_assignable/autodoc/index.html @@ -0,0 +1,165 @@ + + + + + + + Lasso Classes and Functions — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

+ + + + + + + + + + + + + + + + + + +

CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files.

ModelRoadwayNetwork

Subclass of network_wrangler class RoadwayNetwork

Project

A single or set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks including time of day, categories, etc.

+
+
+
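As a quick orientation to these base classes, the lines below sketch one common task: converting a GTFS feed into Cube .lin files with StandardTransit. The method names (read_gtfs, write_as_cube_lin) come from the summaries above; the feed directory, the output path, and the exact keyword signatures are placeholder assumptions and should be checked against the lasso.StandardTransit reference page.

import os

from lasso import StandardTransit

# Read a GTFS feed directory into a StandardTransit object
# (the path is a placeholder, not a real directory).
transit_net = StandardTransit.read_gtfs("path/to/gtfs_feed_dir")

# Write the feed back out as Cube .lin files; the output path is
# likewise a placeholder and the argument handling may differ.
transit_net.write_as_cube_lin(os.path.join("path/to/output_dir", "transit.lin"))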

Utils and Functions

+ + + + + + + + + +

util

logger

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/genindex/index.html b/branch/remove_assignable/genindex/index.html new file mode 100644 index 0000000..2fdd453 --- /dev/null +++ b/branch/remove_assignable/genindex/index.html @@ -0,0 +1,1040 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Index

+ +
+ _ + | A + | B + | C + | D + | E + | F + | G + | H + | I + | K + | L + | M + | N + | O + | P + | R + | S + | T + | U + | V + | W + | X + | Y + | Z + +
+

_

+ + +
+ +

A

+ + + +
+ +

B

+ + + +
+ +

C

+ + + +
+ +

D

+ + + +
+ +

E

+ + + +
+ +

F

+ + + +
+ +

G

+ + + +
+ +

H

+ + + +
+ +

I

+ + + +
+ +

K

+ + +
+ +

L

+ + + +
+ +

M

+ + + +
+ +

N

+ + + +
+ +

O

+ + + +
+ +

P

+ + + +
+ +

R

+ + + +
+ +

S

+ + + +
+ +

T

+ + + +
+ +

U

+ + + +
+ +

V

+ + + +
+ +

W

+ + + +
+ +

X

+ + + +
+ +

Y

+ + +
+ +

Z

+ + + +
+ + + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/index.html b/branch/remove_assignable/index.html new file mode 100644 index 0000000..c65fa50 --- /dev/null +++ b/branch/remove_assignable/index.html @@ -0,0 +1,181 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the network_wrangler package for MetCouncil and MTC. It aims to have the following functionality:

+
    +
  1. parse Cube log files and base highway networks and create ProjectCards for Network Wrangler
  2. parse two Cube transit line files and create ProjectCards for Network Wrangler
  3. refine Network Wrangler highway networks to contain specific variables and settings for the respective agency and export them to a format that can be read in by Citilabs’ Cube software (see the sketch after this list)
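As a rough, non-authoritative sketch of workflows 1 and 3: the method names (Project.create_project, write_project_card, ModelRoadwayNetwork.read, roadway_standard_to_met_council_network, write_roadway_as_fixedwidth) are taken from the class summaries in these docs, while the keyword arguments and file paths shown are assumptions to be verified against the reference pages.

from lasso import ModelRoadwayNetwork, Project

# Workflow 1: turn a Cube network-edit log into a Network Wrangler project card.
# The keyword names mirror attributes listed for Project (roadway_log_file,
# base_roadway_dir) and are assumptions in this sketch.
project = Project.create_project(
    roadway_log_file="my_edits.log",
    base_roadway_dir="path/to/base_roadway_network",
)
project.write_project_card("my_project_card.yml")

# Workflow 3: load a standard roadway network, add the agency-specific
# variables, and export it in a Cube-readable fixed-width format.
# Output locations are assumed to come from the Parameters defaults.
model_net = ModelRoadwayNetwork.read(
    link_filename="link.json",
    node_filename="node.geojson",
    shape_filename="shape.geojson",
)
model_net.roadway_standard_to_met_council_network()
model_net.write_roadway_as_fixedwidth()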
+ +
+
+

Indices and tables

+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/objects.inv b/branch/remove_assignable/objects.inv new file mode 100644 index 0000000..1f37899 Binary files /dev/null and b/branch/remove_assignable/objects.inv differ diff --git a/branch/remove_assignable/py-modindex/index.html b/branch/remove_assignable/py-modindex/index.html new file mode 100644 index 0000000..46c6f94 --- /dev/null +++ b/branch/remove_assignable/py-modindex/index.html @@ -0,0 +1,136 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Python Module Index

+ +
+ l +
+ + + + + + + + + + + + + +
 
+ l
+ lasso +
    + lasso.logger +
    + lasso.util +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/running/index.html b/branch/remove_assignable/running/index.html new file mode 100644 index 0000000..6feca00 --- /dev/null +++ b/branch/remove_assignable/running/index.html @@ -0,0 +1,133 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
+

Create a scenario

+
+
+
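A scenario is typically assembled from a base network plus a list of project cards. The snippet below is a minimal sketch using the network_wrangler calls referenced elsewhere in these docs (Scenario.create_scenario, apply_all_projects); the ProjectCard.read helper, the empty base-scenario dictionary, and the card path are placeholder assumptions to be checked against network_wrangler itself.

from network_wrangler import ProjectCard, Scenario

# Placeholders: an empty base scenario and a single project card read from disk.
my_base_scenario = {}
project_cards_list = [ProjectCard.read("path/to/project_card.yml")]

# Build the scenario and apply every project card to it.
my_scenario = Scenario.create_scenario(
    base_scenario=my_base_scenario,
    project_cards_list=project_cards_list,
)
my_scenario.apply_all_projects()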

Exporting networks

+
+
+

Auditing and Reporting

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/search/index.html b/branch/remove_assignable/search/index.html new file mode 100644 index 0000000..6e5865f --- /dev/null +++ b/branch/remove_assignable/search/index.html @@ -0,0 +1,126 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + + + +
+ +
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + + + + + + \ No newline at end of file diff --git a/branch/remove_assignable/searchindex.js b/branch/remove_assignable/searchindex.js new file mode 100644 index 0000000..18c16cc --- /dev/null +++ b/branch/remove_assignable/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({"docnames": ["_generated/lasso.CubeTransit", "_generated/lasso.ModelRoadwayNetwork", "_generated/lasso.Parameters", "_generated/lasso.Project", "_generated/lasso.StandardTransit", "_generated/lasso.logger", "_generated/lasso.util", "autodoc", "index", "running", "setup", "starting"], "filenames": ["_generated/lasso.CubeTransit.rst", "_generated/lasso.ModelRoadwayNetwork.rst", "_generated/lasso.Parameters.rst", "_generated/lasso.Project.rst", "_generated/lasso.StandardTransit.rst", "_generated/lasso.logger.rst", "_generated/lasso.util.rst", "autodoc.rst", "index.rst", "running.md", "setup.md", "starting.md"], "titles": ["lasso.CubeTransit", "lasso.ModelRoadwayNetwork", "lasso.Parameters", "lasso.Project", "lasso.StandardTransit", "lasso.logger", "lasso.util", "Lasso Classes and Functions", "Welcome to lasso\u2019s documentation!", "Running Lasso", "Setup", "Starting Out"], "terms": {"class": [0, 1, 2, 3, 4, 6, 8, 11], "paramet": [0, 1, 3, 4, 6, 8], "sourc": [0, 1, 2, 3, 4, 5, 6, 11], "base": [0, 1, 2, 3, 4, 6, 8, 11], "object": [0, 1, 2, 3, 4, 6, 11], "store": [0, 1, 11], "inform": [0, 1, 4, 11], "about": [0, 1, 4, 11], "transit": [0, 1, 2, 3, 4, 8], "defin": [0, 1, 2, 3, 11], "cube": [0, 1, 2, 3, 4, 8], "line": [0, 1, 2, 3, 4, 6, 8, 11], "file": [0, 1, 2, 3, 4, 8], "ha": [0, 6, 11], "capabl": [0, 11], "pars": [0, 8, 11], "properti": [0, 1, 2, 4, 6, 11], "shape": [0, 1, 3, 4, 6, 11], "python": [0, 1, 2, 11], "dictionari": [0, 1, 2, 3, 4, 6, 11], "compar": [0, 3, 4, 11], "repres": [0, 2, 4, 6, 11], "chang": [0, 1, 3, 4, 11], "project": [0, 1, 2, 4, 6, 8], "card": [0, 1, 3, 4], "typic": [0, 3, 4, 6, 8], "usag": [0, 1, 3, 4], "exampl": [0, 1, 3, 4, 6, 11], "tn": [0, 11], "create_from_cub": [0, 11], "cube_dir": [0, 3, 11], "transit_change_list": [0, 11], "evaluate_differ": [0, 4, 11], "base_transit_network": [0, 3, 11], "list": [0, 1, 2, 3, 4, 6, 11], "string": [0, 1, 3, 4, 6], "uniqu": [0, 1, 3], "name": [0, 1, 2, 3, 4, 6, 11], "network": [0, 1, 2, 3, 4, 5, 8], "type": [0, 1, 2, 3, 4, 6, 11], "line_properti": 0, "kei": [0, 1, 3, 11], "valu": [0, 1, 3, 4, 6, 11], "ar": [0, 1, 2, 3, 6, 11], "These": 0, "directli": 0, "read": [0, 1, 3, 4, 8, 11], "from": [0, 1, 2, 3, 4, 6, 8], "haven": 0, "t": [0, 1, 6], "been": [0, 6], "translat": [0, 4, 11], "standard": [0, 1, 2, 4, 11], "dict": [0, 1, 2, 3, 4], "panda": [0, 1, 3], "datafram": [0, 1, 3, 4, 11], "node": [0, 1, 2, 3, 4, 6, 11], "follow": [0, 4, 8, 11], "column": [0, 1, 4], "node_id": 0, "int": [0, 1, 2, 3, 4, 6], "posit": [0, 6], "integ": [0, 1, 2], "id": [0, 1, 3, 4, 11], "number": [0, 1, 2, 3, 4, 6, 11], "neg": [0, 6], "indic": [0, 1, 6], "non": [0, 6], "stop": [0, 4], "boolean": [0, 1], "i": [0, 1, 3, 4, 5, 6, 8, 11], "order": [0, 1, 2, 6, 11], "within": [0, 1, 2, 6], "thi": [0, 1, 2, 3, 4, 6, 8, 11], "program_typ": 0, "either": [0, 1, 5, 6, 11], "pt": 0, "trnbld": 0, "str": [0, 1, 2, 3, 6], "instanc": [0, 1, 2, 3, 4, 11], "appli": [0, 1, 6], "which": [0, 1, 3, 6, 11], "includ": [0, 1, 2, 11], "time": [0, 1, 2, 4, 6, 11], "period": [0, 1, 2, 4, 11], "variabl": [0, 1, 2, 3, 4, 8, 11], "source_list": 0, "have": [0, 1, 6, 8], "ad": [0, 1, 3, 6, 11], "diff_dict": 0, "__init__": [0, 1, 2, 3, 4], "constructor": [0, 1, 3], "set": [0, 1, 2, 3, 4, 5, 6, 8, 11], "see": [0, 1, 3, 4, 
6], "an": [0, 1, 3, 4, 6, 11], "method": [0, 1, 2, 3, 4, 6, 11], "add_additional_time_period": 0, "new_time_period_numb": 0, "orig_line_nam": 0, "copi": [0, 1, 3, 4, 6, 11], "rout": [0, 1, 4], "anoth": 0, "appropri": [0, 4], "specif": [0, 1, 8, 11], "new": [0, 1, 6, 11], "under": 0, "self": [0, 1, 2, 3, 4, 6], "origin": [0, 1, 6, 11], "its": [0, 1], "return": [0, 1, 3, 4, 6], "add_cub": 0, "transit_sourc": 0, "lin": [0, 3, 4], "add": [0, 1, 3, 6, 11], "exist": [0, 1, 3, 6, 11], "transitnetwork": [0, 4], "directori": [0, 1, 4, 11], "static": [0, 1, 3, 4], "build_route_nam": 0, "route_id": [0, 4], "time_period": [0, 1, 2], "agency_id": 0, "0": [0, 1, 2, 4, 6, 11], "direction_id": 0, "1": [0, 1, 2, 4, 6, 11], "creat": [0, 1, 2, 3, 4, 6, 8, 11], "contaten": 0, "agenc": [0, 8], "direct": [0, 1, 6, 11], "e": [0, 1, 11], "452": 0, "111": 0, "pk": [0, 2], "construct": [0, 6, 11], "line_nam": 0, "0_452": 0, "111_452_pk1": 0, "calculate_start_end_tim": 0, "line_properties_dict": 0, "calcul": [0, 1, 2, 3, 4, 6], "start": [0, 1, 4, 8], "end": [0, 1, 6], "warn": [0, 1], "doesn": [0, 1], "take": [0, 1], "care": 0, "discongru": 0, "flavor": [0, 1], "create_add_route_card_dict": 0, "format": [0, 1, 2, 4, 8, 11], "route_properti": 0, "being": 0, "updat": [0, 1, 11], "A": [0, 1, 2, 3, 4, 6, 11], "addit": [0, 1, 6, 8, 11], "create_delete_route_card_dict": 0, "base_transit_line_properties_dict": 0, "delet": [0, 1], "style": [0, 6], "attribut": [0, 1, 2, 3, 4, 11], "find": [0, 1, 4], "create_update_route_card_dict": 0, "updated_properties_dict": 0, "cube_properties_to_standard_properti": 0, "cube_properties_dict": 0, "convert": [0, 1, 4, 6], "most": 0, "pertin": 0, "like": [0, 1, 2, 6], "headwai": [0, 2, 4], "varibl": [0, 2], "stnadard": 0, "unit": [0, 1, 6], "minut": 0, "second": [0, 4, 6], "correct": 0, "base_transit": 0, "identifi": [0, 1, 11], "what": [0, 1, 3, 6], "need": [0, 1, 2, 3, 4], "For": [0, 1, 4, 6], "multipl": [0, 1, 6, 11], "make": [0, 1, 11], "duplic": 0, "so": [0, 1, 5, 6, 11], "each": [0, 1, 3, 6], "condit": [0, 1], "contain": [0, 1, 4, 6, 8, 11], "requir": [0, 1, 6, 11], "evalu": [0, 1, 3], "differ": [0, 2, 6], "between": [0, 1, 2, 3, 6], "evaluate_route_property_differ": 0, "properties_build": 0, "properties_bas": 0, "time_period_numb": 0, "absolut": [0, 6], "true": [0, 1, 3, 4, 5, 6, 11], "validate_bas": 0, "fals": [0, 1, 3, 4, 6, 11], "check": [0, 1, 3, 6], "ani": [0, 1, 3, 6, 11], "entri": [0, 3], "property_nam": 0, "property_valu": 0, "us": [0, 1, 2, 3, 4, 6, 11], "command": [0, 1, 3], "rather": [0, 1], "than": [0, 1, 6], "If": [0, 1, 3, 4, 6, 11], "automat": 0, "note": [0, 6, 11], "onli": [0, 1, 3, 4, 6], "numer": [0, 4, 6], "frequenc": [0, 4, 11], "suitabl": 0, "write": [0, 1, 3, 4, 11], "evaluate_route_shape_chang": 0, "shape_build": 0, "shape_bas": 0, "two": [0, 1, 6, 8, 11], "build": [0, 1, 3, 11], "version": [0, 6, 11], "ddatafram": 0, "get_time_period_numbers_from_cube_properti": 0, "properties_list": 0, "associ": [0, 2], "them": [0, 1, 4, 6, 8], "all": [0, 1, 2, 5, 6, 11], "found": [0, 1, 6], "unpack_route_nam": 0, "unpack": 0, "info": [0, 1, 2], "link": [1, 2, 3, 6, 11], "kwarg": [1, 2, 3, 6], "roadwaynetwork": [1, 3, 4], "subclass": [1, 11], "network_wrangl": [1, 8, 11], "represent": [1, 3, 4, 6], "physic": 1, "roadwai": [1, 2, 3, 4, 11], "geodatafram": [1, 11], "specifi": [1, 3, 6], "default": [1, 2, 3, 4, 6, 11], "cr": [1, 3, 6], "coordin": [1, 3, 6], "refer": [1, 3, 4, 6, 11], "system": [1, 3], "espg": [1, 3], "node_foreign_kei": [1, 3], "tabl": [1, 2, 3, 6], 
"link_foreign_kei": [1, 3], "foreign": [1, 3], "shape_foreign_kei": [1, 3, 11], "unique_link_id": [1, 3], "unique_node_id": [1, 3], "modes_to_network_link_vari": [1, 3], "map": [1, 2, 3, 6, 11], "mode": [1, 3, 4, 11], "modes_to_network_nodes_vari": [1, 3], "managed_lanes_node_id_scalar": [1, 3], "scalar": [1, 3, 6], "primari": [1, 3], "correspond": [1, 3, 4], "manag": [1, 2, 3, 11], "lane": [1, 2, 3, 11], "managed_lanes_link_id_scalar": [1, 3], "managed_lanes_required_attribut": [1, 3], "must": [1, 3, 6], "keep_same_attributes_ml_and_gp": [1, 3], "parallel": [1, 3, 6], "gener": [1, 3], "purpos": [1, 3, 6], "add_count": 1, "network_vari": 1, "aadt": 1, "mndot_count_shst_data": 1, "none": [1, 3, 4, 5, 6], "widot_count_shst_data": 1, "mndot_count_variable_shp": [1, 2], "widot_count_variable_shp": 1, "count": [1, 2], "mc": [1, 2, 4], "join": [1, 3, 4, 6, 11], "data": [1, 2, 3, 4, 8, 11], "via": 1, "shst": 1, "api": 1, "match": 1, "result": [1, 6, 11], "should": [1, 2, 3, 6, 11], "written": [1, 11], "path": [1, 3, 4, 6, 11], "mndot": [1, 2], "locat": [1, 2, 3, 4], "widot": 1, "geodatabas": 1, "add_incident_link_data_to_nod": 1, "links_df": [1, 11], "nodes_df": [1, 11], "link_vari": 1, "unique_node_kei": 1, "model_node_id": [1, 2], "go": [1, 11], "assess": [1, 3], "connect": 1, "incid": 1, "where": 1, "length": [1, 6], "n": [1, 2, 3, 6, 11], "out": [1, 2, 5, 6, 8], "add_new_roadway_feature_chang": 1, "featur": [1, 6], "also": [1, 6, 11], "valid": [1, 6, 11], "add_variable_using_shst_refer": 1, "var_shst_csvdata": 1, "shst_csv_variabl": 1, "network_var_typ": 1, "overwrit": 1, "bool": [1, 3, 6], "addition_map": 1, "show": 1, "project_card_dictionari": 1, "wrapper": [1, 8, 11], "apply_managed_lane_feature_chang": 1, "link_idx": 1, "in_plac": [1, 11], "lndice": 1, "whether": 1, "decid": 1, "connector": [1, 2], "when": [1, 3, 4, 6], "thei": [1, 2], "more": [1, 6, 11], "apply_python_calcul": 1, "pycod": 1, "execut": 1, "code": [1, 2, 4, 6, 11], "apply_roadway_feature_chang": [1, 11], "select": [1, 11], "pass": [1, 5, 6], "assess_connect": [1, 11], "ignore_end_nod": [1, 11], "graph": 1, "disconnect": 1, "subgraph": 1, "describ": [1, 11], "member": [1, 6], "one": [1, 6, 11], "drive": [1, 11], "walk": [1, 11], "bike": 1, "ignor": [1, 6], "strai": 1, "singleton": 1, "tupl": [1, 6], "osmnx": [1, 11], "networkx": 1, "digraph": 1, "build_selection_kei": 1, "selection_dict": 1, "combin": [1, 2, 4, 6], "queri": [1, 11], "b": [1, 2, 11], "you": [1, 3, 11], "selection_dictonari": 1, "serv": 1, "calculate_area_typ": 1, "area_type_shap": [1, 2], "area_type_shape_vari": 1, "area_typ": [1, 2, 3], "area_type_codes_dict": 1, "downtown_area_type_shap": [1, 2], "downtown_area_typ": [1, 2], "area": [1, 2, 3, 6], "centroid": [1, 2, 6], "geometri": [1, 2, 6], "field": [1, 2, 11], "determin": [1, 3, 6], "label": 1, "isn": 1, "perfect": 1, "much": 1, "quicker": 1, "other": [1, 6, 11], "The": [1, 2, 5, 6, 11], "geodadabas": 1, "input": [1, 6], "downtown": [1, 2], "boundari": [1, 2, 6], "counti": [1, 2, 3], "calculate_centroidconnect": 1, "centroidconnect": [1, 2, 3], "highest_taz_numb": [1, 2], "as_integ": 1, "max": 1, "taz": [1, 2], "calculate_counti": 1, "county_shap": [1, 2], "county_shape_vari": 1, "county_codes_dict": 1, "calculate_dist": 1, "distanc": [1, 2, 3, 6], "centroidconnect_onli": 1, "mile": 1, "centroidconnector": 1, "calculate_mpo": 1, "county_network_vari": 1, "mpo": [1, 2], "mpo_counti": [1, 2], "param": [1, 4], "county_vari": 1, "region": [1, 6], "calculate_us": 1, "defauli": 1, "convert_int": 1, 
"int_col_nam": 1, "create_ml_vari": 1, "ml_lane": [1, 2], "ml": 1, "placehold": 1, "come": 1, "log": [1, 3, 8], "create_calculated_vari": 1, "create_dummy_connector_link": 1, "ml_df": 1, "access_lan": 1, "egress_lan": 1, "access_roadwai": 1, "ml_access": 1, "egress_roadwai": 1, "access_name_prefix": 1, "access": [1, 4, 6], "dummi": 1, "egress_name_prefix": 1, "egress": 1, "gp_df": 1, "roadai": 1, "prefix": 1, "create_hov_corridor_vari": 1, "segment_id": [1, 2], "hov": 1, "corridor": 1, "create_managed_lane_network": [1, 11], "keep_additional_attributes_ml_and_gp": 1, "separ": [1, 3, 6], "look": 1, "want": [1, 11], "leav": 1, "some": 1, "rigor": 1, "test": [1, 2, 6], "create_managed_vari": 1, "dataframe_to_fixed_width": 1, "df": 1, "fix": [1, 2], "width": [1, 6], "transform": [1, 6], "delete_roadway_feature_chang": 1, "ignore_miss": 1, "get": [1, 2, 6, 11], "miss": 1, "fail": [1, 6], "deletion_map": 1, "fill_na": 1, "fill": [1, 6], "na": 1, "from_roadwaynetwork": [1, 11], "roadway_network_object": 1, "get_attribut": 1, "join_kei": 1, "source_shst_ref_df": 1, "source_gdf": 1, "field_nam": 1, "get_managed_lane_node_id": 1, "nodes_list": 1, "4500000": 1, "237": 1, "get_modal_graph": 1, "bike_access": [1, 2], "bu": [1, 4], "bus_onli": [1, 2], "drive_access": [1, 2, 11], "rail": [1, 4], "rail_onli": [1, 2], "walk_access": [1, 2], "strongli": 1, "vertex": [1, 6], "reachabl": 1, "everi": [1, 6], "get_modal_links_nod": 1, "kept": 1, "both": [1, 4, 6, 11], "filter": [1, 6], "right": [1, 6], "now": 1, "we": [1, 4, 11], "don": 1, "becaus": [1, 4, 6, 11], "mark": 1, "issu": 1, "discuss": 1, "http": [1, 4, 6, 11], "github": [1, 6, 11], "com": [1, 4, 6, 11], "wsp": [1, 11], "sag": [1, 11], "145": 1, "modal_nodes_df": 1, "mode_node_vari": 1, "get_property_by_time_period_and_group": 1, "prop": 1, "categori": [1, 2, 11], "default_return": 1, "seri": [1, 11], "group": 1, "16": [1, 2, 6], "00": [1, 2], "19": [1, 2], "option": [1, 6], "sov": [1, 2], "search": [1, 2, 8, 11], "hov3": [1, 2], "hov2": [1, 2], "identify_seg": 1, "o_id": 1, "d_id": 1, "endpoint": 1, "up": [1, 5, 11], "segment": [1, 4, 6, 11], "candid": 1, "otherwis": [1, 4, 6], "ram": 1, "hog": 1, "could": [1, 6, 11], "odd": 1, "shortest": 1, "segment_vari": 1, "keep": [1, 6], "identify_segment_endpoint": 1, "min_connecting_link": 1, "10": [1, 2, 6], "min_dist": 1, "max_link_devi": 1, "2": [1, 2, 4, 6, 11], "is_network_connect": [1, 11], "consid": [1, 6, 11], "cach": 1, "long": [1, 6], "load_transform_network": 1, "node_filenam": [1, 11], "link_filenam": [1, 11], "shape_filenam": [1, 11], "4326": [1, 6], "validate_schema": 1, "disk": 1, "schema": [1, 11], "shapes_df": [1, 11], "network_connection_plot": 1, "g": [1, 6], "disconnected_subgraph_nod": 1, "plot": 1, "fig": 1, "ax": [1, 6], "orig_dest_nodes_foreign_kei": 1, "whatev": 1, "u": 1, "v": [1, 2, 11], "ab": 1, "noth": 1, "assum": 1, "a_id": 1, "b_id": 1, "ox_graph": 1, "unique_link_kei": 1, "model_link_id": [1, 2, 3], "arrai": [1, 6], "remov": [1, 6], "certain": 1, "do": [1, 6, 11], "too": [1, 5], "link_df": 1, "referenc": 1, "multidigraph": 1, "path_search": 1, "candidate_links_df": 1, "weight_column": 1, "weight_factor": 1, "search_breadth": 1, "5": [1, 2, 4, 6], "max_search_breadth": 1, "candidate_link": 1, "part": [1, 6, 11], "foreigh": 1, "destin": 1, "weight": 1, "iter": [1, 6], "multipli": 1, "fast": [1, 11], "recalculate_calculated_vari": [1, 3], "recalculate_dist": [1, 3], "json": [1, 11], "geojson": 1, "skip": 1, "speed": 1, "spatial": [1, 6], "etc": [1, 2, 3, 11], "re": 1, 
"read_match_result": 1, "lot": 1, "same": [1, 4, 6], "concaten": 1, "singl": [1, 3, 6], "geopanda": [1, 11], "sure": 1, "why": 1, "util": [1, 8, 11], "rename_variables_for_dbf": 1, "input_df": 1, "variable_crosswalk": 1, "output_vari": [1, 2], "convert_geometry_to_xi": 1, "renam": [1, 3], "dbf": 1, "shp": [1, 2], "char": 1, "crosswalk": [1, 3], "x": [1, 2, 6], "y": [1, 2, 6], "roadway_net_to_gdf": 1, "roadway_net": 1, "turn": [1, 11], "export": [1, 8, 11], "sophist": 1, "attach": 1, "roadway_standard_to_met_council_network": 1, "output_epsg": [1, 2], "consist": [1, 6, 11], "metcouncil": [1, 2, 4, 8, 11], "": [1, 2, 4, 6, 11], "model": [1, 2], "expect": [1, 6], "epsg": [1, 2, 6], "output": [1, 2, 3, 4, 6], "select_roadway_featur": [1, 11], "search_mod": 1, "force_search": 1, "sp_weight_factor": 1, "satisfi": [1, 6], "criteria": 1, "net": [1, 11], "osm": [1, 2], "share": [1, 11], "street": [1, 11], "osm_model_link_id": 1, "1234": 1, "shstid": 1, "4321": 1, "regex": 1, "facil": [1, 2, 11], "main": 1, "st": [1, 2], "least": [1, 11], "perform": [1, 4], "even": 1, "previou": 1, "discourag": 1, "meander": 1, "ref": 1, "here": [1, 6], "defaul": 1, "selection_has_unique_link_id": 1, "selection_dictionari": 1, "selection_map": 1, "selected_link_idx": 1, "candidate_link_idx": 1, "selected_links_idx": 1, "candidate_links_idx": 1, "shortest_path": 1, "graph_links_df": 1, "100": 1, "four": 1, "nx": 1, "split_properties_by_time_period_and_categori": 1, "properties_to_split": [1, 2], "split": [1, 2, 4], "structur": 1, "stratifi": 1, "times_period": 1, "am": [1, 2], "6": [1, 2, 4, 6], "pm": [1, 2], "15": [1, 2], "update_dist": 1, "use_shap": 1, "inplac": 1, "straight": 1, "avail": 1, "portion": 1, "provid": [1, 3, 4, 6, 11], "entir": 1, "crow": 1, "fly": 1, "meter": [1, 6], "nan": 1, "validate_link_schema": 1, "schema_loc": 1, "roadway_network_link": 1, "validate_node_schema": 1, "node_fil": 1, "roadway_network_nod": 1, "validate_properti": 1, "ignore_exist": 1, "require_existing_for_chang": 1, "theproject": 1, "dictonari": 1, "validate_select": 1, "selection_requir": 1, "whetther": 1, "minimum": [1, 5, 6], "validate_shape_schema": 1, "shape_fil": 1, "roadway_network_shap": 1, "validate_uniqu": 1, "confirm": 1, "met": 1, "filenam": [1, 3, 11], "were": 1, "save": 1, "write_roadway_as_fixedwidth": [1, 11], "output_dir": 1, "node_output_vari": 1, "link_output_vari": 1, "output_link_txt": [1, 2], "output_node_txt": [1, 2], "output_link_header_width_txt": [1, 2], "output_node_header_width_txt": [1, 2], "output_cube_network_script": [1, 2], "drive_onli": 1, "function": [1, 5, 6, 8, 11], "doe": [1, 4], "header": [1, 2], "3": [1, 2, 4, 6, 11], "script": [1, 2], "run": [1, 8], "databas": [1, 3], "record": [1, 4], "write_roadway_as_shp": [1, 11], "data_to_csv": 1, "data_to_dbf": 1, "output_link_shp": [1, 2], "output_node_shp": [1, 2], "output_link_csv": [1, 2], "output_node_csv": [1, 2], "output_gpkg": 1, "output_link_gpkg_lay": 1, "output_node_gpkg_lay": 1, "output_gpkg_link_filt": 1, "gpkg": 1, "csv": [1, 2, 3, 11], "full": [1, 6], "geopackag": 1, "layer": [1, 11], "subset": 1, "calculated_valu": [1, 3], "dai": [2, 4, 11], "can": [2, 6, 8, 11], "runtim": [2, 11], "initi": [2, 3, 11], "keyword": [2, 6, 11], "argument": [2, 6, 11], "explicitli": [2, 11], "highlight": [2, 11], "attr": 2, "time_period_to_tim": 2, "abbrevi": [2, 4], "gtf": [2, 4, 11], "highwai": [2, 3, 8, 11], "ea": 2, "md": 2, "ev": 2, "cube_time_period": 2, "4": [2, 4, 6], "demand": 2, "allow": [2, 6], "suffix": 2, "truck": 2, "trk": 2, "final": 2, 
"lanes_am": 2, "time_periods_to_tim": 2, "shapefil": 2, "r": 2, "metcouncil_data": 2, "cb_2017_us_county_5m": 2, "county_variable_shp": 2, "lanes_lookup_fil": 2, "lookup": 2, "centroid_connect_lan": 2, "anoka": 2, "dakota": 2, "hennepin": 2, "ramsei": 2, "scott": 2, "washington": 2, "carver": 2, "taz_shap": 2, "tazofficialwcurrentforecast": 2, "taz_data": 2, "highest": 2, "3100": 2, "link_id": 2, "shstgeometryid": 2, "roadway_class": 2, "truck_access": 2, "trn_priority_ea": 2, "trn_priority_am": 2, "trn_priority_md": 2, "trn_priority_pm": 2, "trn_priority_ev": 2, "ttime_assert_ea": 2, "ttime_assert_am": 2, "ttime_assert_md": 2, "ttime_assert_pm": 2, "ttime_assert_ev": 2, "lanes_ea": 2, "lanes_md": 2, "lanes_pm": 2, "lanes_ev": 2, "price_sov_ea": 2, "price_hov2_ea": 2, "price_hov3_ea": 2, "price_truck_ea": 2, "price_sov_am": 2, "price_hov2_am": 2, "price_hov3_am": 2, "price_truck_am": 2, "price_sov_md": 2, "price_hov2_md": 2, "price_hov3_md": 2, "price_truck_md": 2, "price_sov_pm": 2, "price_hov2_pm": 2, "price_hov3_pm": 2, "price_truck_pm": 2, "price_sov_ev": 2, "price_hov2_ev": 2, "price_hov3_ev": 2, "price_truck_ev": 2, "roadway_class_idx": 2, "facility_typ": 2, "osm_node_id": [2, 6, 11], "bike_nod": 2, "transit_nod": 2, "walk_nod": 2, "drive_nod": 2, "ml_lanes_ea": 2, "ml_lanes_am": 2, "ml_lanes_md": 2, "ml_lanes_pm": 2, "ml_lanes_ev": 2, "osm_facility_type_dict": 2, "thrivemsp2040communitydesign": 2, "area_type_variable_shp": 2, "comdes2040": 2, "area_type_code_dict": 2, "23": 2, "urban": [2, 4], "center": [2, 6], "24": 2, "25": 2, "35": 2, "36": 2, "41": 2, "51": 2, "52": 2, "53": 2, "60": 2, "downtownzones_taz": 2, "mrcc_roadway_class_shap": 2, "mrcc": 2, "trans_mrcc_centerlin": 2, "mrcc_roadway_class_variable_shp": 2, "mrcc_roadway_class_shp": 2, "route_si": 2, "widot_roadway_class_shap": 2, "wisconsin": 2, "wisconsin_lanes_counts_median": 2, "wislr": 2, "widot_roadway_class_variable_shp": 2, "rdwy_ctgy_": 2, "mndot_count_shap": 2, "count_mn": 2, "aadt_2017_count_loc": 2, "osm_highway_facility_type_crosswalk": 2, "legacy_tm2_attribut": 2, "shstreferenceid": 2, "legaci": 2, "tm2": 2, "osm_lanes_attribut": 2, "tam_tm2_attribut": 2, "tam": 2, "tom_tom_attribut": 2, "tomtom": 2, "tomtom_attribut": 2, "sfcta_attribut": 2, "sfcta": 2, "geograph": 2, "102646": 2, "scratch": 2, "txt": [2, 11], "links_header_width": 2, "nodes_header_width": 2, "import": [2, 6], "make_complete_network_from_fixed_width_fil": 2, "county_link_range_dict": 2, "county_code_dict": 2, "7": [2, 4, 11], "extern": 2, "chisago": 2, "11": 2, "goodhu": 2, "12": 2, "isanti": 2, "13": 2, "le": 2, "sueur": 2, "14": 2, "mcleod": 2, "pierc": 2, "polk": 2, "17": 2, "rice": 2, "18": 2, "sherburn": 2, "siblei": 2, "20": 2, "croix": 2, "21": 2, "wright": 2, "22": 2, "route_type_bus_mode_dict": 2, "urb": 2, "loc": 2, "sub": [2, 6], "express": [2, 4, 6], "route_type_mode_dict": 2, "8": [2, 4, 6], "9": [2, 4], "cube_time_periods_nam": 2, "op": 2, "detail": [2, 5], "zone": 2, "possibl": [2, 6], "roadway_link_chang": 3, "roadway_node_chang": 3, "transit_chang": [3, 4], "base_roadway_network": 3, "base_cube_transit_network": 3, "build_cube_transit_network": 3, "project_nam": 3, "produc": [3, 6], "test_project": [3, 11], "create_project": [3, 11], "base_cube_transit_sourc": 3, "o": [3, 4, 11], "build_cube_transit_sourc": 3, "transit_route_shape_chang": [3, 11], "evaluate_chang": [3, 11], "write_project_card": [3, 11], "scratch_dir": [3, 11], "t_transit_shape_test": [3, 11], "yml": [3, 11], "default_project_nam": 3, "level": 3, 
"constant": 3, "static_valu": 3, "card_data": 3, "cubetransit": [3, 8], "bunch": 3, "projectcard": [3, 8], "case": 3, "standardtransit": [3, 8], "add_highway_chang": 3, "limit_variables_to_existing_network": 3, "hoc": [3, 11], "add_transit_chang": 3, "roadway_log_fil": 3, "roadway_shp_fil": 3, "roadway_csv_fil": 3, "network_build_fil": 3, "emme_node_id_crosswalk_fil": 3, "emme_name_crosswalk_fil": 3, "base_roadway_dir": 3, "base_transit_dir": [3, 4, 11], "consum": 3, "logfil": 3, "emm": [3, 4], "folder": 3, "base_cube_transit_fil": 3, "build_cube_transit_fil": 3, "first": [3, 4, 6], "recalcul": 3, "determine_roadway_network_changes_compat": 3, "emme_id_to_wrangler_id": 3, "emme_link_change_df": 3, "emme_node_change_df": 3, "emme_transit_changes_df": 3, "rewrit": 3, "wrangler": [3, 8, 11], "emme_name_to_wrangler_nam": 3, "aggreg": 3, "get_object_from_network_build_command": 3, "row": [3, 4], "histori": 3, "l": 3, "get_operation_from_network_build_command": 3, "action": 3, "c": [3, 6], "d": [3, 6], "read_logfil": 3, "logfilenam": 3, "reprsent": [3, 11], "read_network_build_fil": 3, "networkbuildfilenam": 3, "nework": 3, "assign_group": 3, "user": [3, 11], "TO": 3, "ptg_feed": 4, "hold": [4, 11], "feed": [4, 11], "partridg": [4, 11], "manipul": [4, 11], "cube_transit_net": [4, 11], "read_gtf": [4, 11], "write_as_cube_lin": [4, 11], "write_dir": [4, 11], "outfil": [4, 11], "calculate_cube_mod": 4, "assign": 4, "logic": 4, "route_typ": 4, "develop": [4, 11], "googl": 4, "cube_mod": 4, "route_type_to_cube_mod": 4, "tram": 4, "streetcar": 4, "light": 4, "further": 4, "disaggreg": 4, "buse": 4, "suburban": 4, "longnam": 4, "lower": [4, 6], "elif": 4, "99": 4, "local": [4, 11], "els": [4, 6], "route_long_nam": 4, "cube_format": 4, "represnt": 4, "notat": 4, "trip": 4, "trip_id": 4, "shape_id": [4, 11], "tod": 4, "onewai": 4, "oper": [4, 6], "fromtransitnetwork": 4, "transit_network_object": 4, "modelroadwaynetwork": [4, 8], "gtfs_feed_dir": 4, "route_properties_gtfs_to_cub": 4, "prepar": 4, "trip_df": 4, "shape_gtfs_to_cub": 4, "add_nntim": 4, "shape_gtfs_to_dict_list": 4, "step": 4, "through": 4, "todo": 4, "elimin": 4, "necess": 4, "tag": [4, 11], "begin": 4, "As": 4, "m": 4, "minim": 4, "modif": 4, "question": 4, "shape_pt_sequ": 4, "shape_mode_node_id": 4, "is_stop": 4, "stop_sequ": 4, "shape_gtfs_to_emm": 4, "trip_row": 4, "time_to_cube_time_period": 4, "start_time_sec": 4, "as_str": 4, "verbos": 4, "midnight": [4, 6], "this_tp": 4, "this_tp_num": 4, "outpath": 4, "after": 4, "setuplog": 5, "infologfilenam": 5, "debuglogfilenam": 5, "logtoconsol": 5, "infolog": 5, "ters": 5, "just": 5, "give": 5, "bare": 5, "composit": 5, "clear": 5, "later": 5, "debuglog": 5, "veri": [5, 6], "noisi": 5, "debug": 5, "spew": 5, "consol": 5, "point": 6, "arg": 6, "basegeometri": 6, "possibli": 6, "z": 6, "zero": 6, "dimension": 6, "float": 6, "sequenc": 6, "individu": 6, "p": 6, "print": 6, "almost_equ": 6, "decim": 6, "equal": 6, "place": 6, "deprec": 6, "sinc": 6, "confus": 6, "equals_exact": 6, "instead": 6, "approxim": 6, "compon": [6, 8], "linestr": 6, "1e": 6, "buffer": 6, "quad_seg": 6, "cap_styl": 6, "round": 6, "join_styl": 6, "mitre_limit": 6, "single_sid": 6, "dilat": 6, "eros": 6, "small": 6, "mai": 6, "sometim": 6, "tidi": 6, "polygon": 6, "around": [6, 8, 11], "resolut": 6, "angl": 6, "fillet": 6, "buffercapstyl": 6, "squar": 6, "flat": 6, "circular": 6, "rectangular": 6, "while": 6, "involv": 6, "bufferjoinstyl": 6, "mitr": 6, "bevel": 6, "midpoint": 6, "edg": [6, 8], "touch": 6, "depend": 6, 
"limit": 6, "ratio": 6, "sharp": 6, "corner": 6, "offset": 6, "meet": 6, "miter": 6, "extend": 6, "To": [6, 11], "prevent": 6, "unreason": 6, "control": 6, "maximum": 6, "exce": 6, "side": 6, "sign": 6, "left": 6, "hand": 6, "regular": 6, "cap": 6, "alwai": 6, "forc": 6, "equival": 6, "cap_flat": 6, "quadseg": 6, "alia": 6, "strictli": 6, "wkt": 6, "load": 6, "gon": 6, "approx": 6, "radiu": 6, "circl": 6, "1365484905459": 6, "128": 6, "141513801144": 6, "triangl": 6, "exterior": 6, "coord": 6, "contains_properli": 6, "complet": 6, "common": 6, "document": [6, 11], "covered_bi": 6, "cover": 6, "cross": 6, "grid_siz": 6, "disjoint": 6, "unitless": 6, "dwithin": 6, "given": [6, 11], "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "line_interpolate_point": 6, "intersect": 6, "line_locate_point": 6, "nearest": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "point_on_surfac": 6, "guarante": 6, "cheapli": 6, "representative_point": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "interior": 6, "unchang": 6, "is_ccw": 6, "clockwis": 6, "max_segment_length": 6, "vertic": 6, "longer": 6, "evenli": 6, "subdivid": 6, "densifi": 6, "unmodifi": 6, "array_lik": 6, "greater": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "union": 6, "dimens": 6, "bound": 6, "collect": 6, "empti": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "orient": 6, "rotat": 6, "rectangl": 6, "enclos": 6, "unlik": 6, "constrain": 6, "degener": 6, "oriented_envelop": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "pair": [6, 11], "tripl": 6, "abov": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, "transliter": 6, 
"unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, "unidecodeerror": 6, "substitut": 6, "might": [6, 11], "packag": [8, 11], "mtc": 8, "aim": 8, "networkwrangl": [8, 11], "refin": 8, "respect": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "clone": 8, "brief": 8, "intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, "rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "travel": 11, "break": 11, "publictransport": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", "line_properties"], [0, 
3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], [1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", "calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "cube_time_periods"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatability"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", 
"get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, "", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", "roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, "", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", "cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_dict_list"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", "time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", 
"disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 3, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 9, "sphinx.domains.index": 1, "sphinx.domains.javascript": 3, "sphinx.domains.math": 2, "sphinx.domains.python": 4, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 58}, "alltitles": {"lasso.CubeTransit": [[0, "lasso-cubetransit"]], "lasso.ModelRoadwayNetwork": [[1, "lasso-modelroadwaynetwork"]], "Todo": [[1, "id1"], [1, "id2"], [1, "id3"], [1, "id4"], [1, "id5"], [1, "id6"]], "lasso.Parameters": [[2, "lasso-parameters"]], "lasso.Project": [[3, "lasso-project"]], "lasso.StandardTransit": [[4, "lasso-standardtransit"]], "lasso.logger": [[5, "module-lasso.logger"]], "lasso.util": [[6, "module-lasso.util"]], "Lasso Classes and Functions": 
[[7, "module-lasso"]], "Base Classes": [[7, "base-classes"]], "Utils and Functions": [[7, "utils-and-functions"]], "Welcome to lasso\u2019s documentation!": [[8, "welcome-to-lasso-s-documentation"]], "Contents:": [[8, null]], "Indices and tables": [[8, "indices-and-tables"]], "Running Lasso": [[9, "running-lasso"]], "Create project files": [[9, "create-project-files"]], "Create a scenario": [[9, "create-a-scenario"]], "Exporting networks": [[9, "exporting-networks"]], "Auditing and Reporting": [[9, "auditing-and-reporting"]], "Setup": [[10, "setup"]], "Projects": [[10, "projects"]], "Parameters": [[10, "parameters"], [11, "parameters"]], "Settings": [[10, "settings"]], "Additional Data Files": [[10, "additional-data-files"]], "Starting Out": [[11, "starting-out"]], "Installation": [[11, "installation"]], "Bleeding Edge": [[11, "bleeding-edge"]], "From Clone": [[11, "from-clone"]], "Brief Intro": [[11, "brief-intro"]], "Components": [[11, "components"]], "RoadwayNetwork": [[11, "roadwaynetwork"]], "TransitNetwork": [[11, "transitnetwork"]], "ProjectCard": [[11, "projectcard"]], "Scenario": [[11, "scenario"]], "Project": [[11, "project"]], "ModelRoadwayNetwork": [[11, "modelroadwaynetwork"]], "StandardTransit": [[11, "standardtransit"]], "CubeTransit": [[11, "cubetransit"]], "Typical Workflow": [[11, "typical-workflow"]], "Project Cards from Transit LIN Files": [[11, "project-cards-from-transit-lin-files"]], "Project Cards from Cube LOG Files": [[11, "project-cards-from-cube-log-files"]], "Model Network Files for a Scenario": [[11, "model-network-files-for-a-scenario"]], "Running Quickstart Jupyter Notebooks": [[11, "running-quickstart-jupyter-notebooks"]]}, "indexentries": {"cubetransit (class in lasso)": [[0, "lasso.CubeTransit"]], "__init__() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.__init__"]], "add_additional_time_periods() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_additional_time_periods"]], "add_cube() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_cube"]], "build_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.build_route_name"]], "calculate_start_end_times() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.calculate_start_end_times"]], "create_add_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_add_route_card_dict"]], "create_delete_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_delete_route_card_dict"]], "create_from_cube() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.create_from_cube"]], "create_update_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_update_route_card_dict"]], "cube_properties_to_standard_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.cube_properties_to_standard_properties"]], "diff_dict (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.diff_dict"]], "evaluate_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_differences"]], "evaluate_route_property_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_route_property_differences"]], "evaluate_route_shape_changes() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.evaluate_route_shape_changes"]], "get_time_period_numbers_from_cube_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.get_time_period_numbers_from_cube_properties"]], "line_properties (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.line_properties"]], "lines 
(lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.lines"]], "parameters (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.parameters"]], "program_type (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.program_type"]], "shapes (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.shapes"]], "source_list (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.source_list"]], "unpack_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.unpack_route_name"]], "calculated_values (lasso.modelroadwaynetwork attribute)": [[1, "lasso.ModelRoadwayNetwork.CALCULATED_VALUES"]], "modelroadwaynetwork (class in lasso)": [[1, "lasso.ModelRoadwayNetwork"]], "__init__() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.__init__"]], "add_counts() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_counts"]], "add_incident_link_data_to_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.add_incident_link_data_to_nodes"]], "add_new_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_new_roadway_feature_change"]], "add_variable_using_shst_reference() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_variable_using_shst_reference"]], "addition_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.addition_map"]], "apply() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply"]], "apply_managed_lane_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_managed_lane_feature_change"]], "apply_python_calculation() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_python_calculation"]], "apply_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_roadway_feature_change"]], "assess_connectivity() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.assess_connectivity"]], "build_selection_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.build_selection_key"]], "calculate_area_type() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_area_type"]], "calculate_centroidconnect() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_centroidconnect"]], "calculate_county() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_county"]], "calculate_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_distance"]], "calculate_mpo() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_mpo"]], "calculate_use() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_use"]], "convert_int() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.convert_int"]], "create_ml_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_ML_variable"]], "create_calculated_variables() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_calculated_variables"]], "create_dummy_connector_links() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_dummy_connector_links"]], "create_hov_corridor_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_hov_corridor_variable"]], "create_managed_lane_network() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.create_managed_lane_network"]], "create_managed_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_variable"]], "dataframe_to_fixed_width() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.dataframe_to_fixed_width"]], "delete_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.delete_roadway_feature_change"]], "deletion_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.deletion_map"]], "fill_na() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.fill_na"]], "from_roadwaynetwork() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.from_RoadwayNetwork"]], "get_attribute() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_attribute"]], "get_managed_lane_node_ids() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_managed_lane_node_ids"]], "get_modal_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_graph"]], "get_modal_links_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_links_nodes"]], "get_property_by_time_period_and_group() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.get_property_by_time_period_and_group"]], "identify_segment() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment"]], "identify_segment_endpoints() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment_endpoints"]], "is_network_connected() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.is_network_connected"]], "load_transform_network() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.load_transform_network"]], "network_connection_plot() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.network_connection_plot"]], "orig_dest_nodes_foreign_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.orig_dest_nodes_foreign_key"]], "ox_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.ox_graph"]], "path_search() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.path_search"]], "read() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read"]], "read_match_result() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read_match_result"]], "rename_variables_for_dbf() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.rename_variables_for_dbf"]], "roadway_net_to_gdf() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.roadway_net_to_gdf"]], "roadway_standard_to_met_council_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.roadway_standard_to_met_council_network"]], "select_roadway_features() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.select_roadway_features"]], "selection_has_unique_link_id() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_has_unique_link_id"]], "selection_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_map"]], "shortest_path() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.shortest_path"]], "split_properties_by_time_period_and_category() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.split_properties_by_time_period_and_category"]], "update_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.update_distance"]], "validate_link_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_link_schema"]], "validate_node_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_node_schema"]], "validate_properties() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_properties"]], "validate_selection() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_selection"]], "validate_shape_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_shape_schema"]], "validate_uniqueness() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_uniqueness"]], "write() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write"]], "write_roadway_as_fixedwidth() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_fixedwidth"]], "write_roadway_as_shp() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_shp"]], "parameters (class in lasso)": [[2, "lasso.Parameters"]], "__init__() (lasso.parameters method)": [[2, "lasso.Parameters.__init__"]], "county_link_range_dict (lasso.parameters attribute)": [[2, "lasso.Parameters.county_link_range_dict"]], "cube_time_periods (lasso.parameters attribute)": [[2, "lasso.Parameters.cube_time_periods"]], "properties_to_split (lasso.parameters attribute)": [[2, "lasso.Parameters.properties_to_split"]], "zones (lasso.parameters attribute)": [[2, "lasso.Parameters.zones"]], "calculated_values (lasso.project attribute)": [[3, "lasso.Project.CALCULATED_VALUES"]], "default_project_name (lasso.project attribute)": [[3, "id0"], [3, "lasso.Project.DEFAULT_PROJECT_NAME"]], "project (class in lasso)": [[3, "lasso.Project"]], "static_values (lasso.project attribute)": [[3, "id1"], [3, "lasso.Project.STATIC_VALUES"]], "__init__() (lasso.project method)": [[3, "lasso.Project.__init__"]], "add_highway_changes() (lasso.project method)": [[3, "lasso.Project.add_highway_changes"]], "add_transit_changes() (lasso.project method)": [[3, "lasso.Project.add_transit_changes"]], "base_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.base_cube_transit_network"]], "base_roadway_network (lasso.project attribute)": [[3, "lasso.Project.base_roadway_network"]], "build_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.build_cube_transit_network"]], "card_data (lasso.project attribute)": [[3, "lasso.Project.card_data"]], "create_project() (lasso.project static method)": [[3, "lasso.Project.create_project"]], "determine_roadway_network_changes_compatability() (lasso.project static method)": [[3, "lasso.Project.determine_roadway_network_changes_compatability"]], "emme_id_to_wrangler_id() (lasso.project static method)": [[3, "lasso.Project.emme_id_to_wrangler_id"]], "emme_name_to_wrangler_name() (lasso.project static method)": [[3, "lasso.Project.emme_name_to_wrangler_name"]], "evaluate_changes() (lasso.project method)": [[3, "lasso.Project.evaluate_changes"]], "get_object_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_object_from_network_build_command"]], "get_operation_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_operation_from_network_build_command"]], "parameters 
(lasso.project attribute)": [[3, "lasso.Project.parameters"]], "project_name (lasso.project attribute)": [[3, "lasso.Project.project_name"]], "read_logfile() (lasso.project static method)": [[3, "lasso.Project.read_logfile"]], "read_network_build_file() (lasso.project static method)": [[3, "lasso.Project.read_network_build_file"]], "roadway_link_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_link_changes"]], "roadway_node_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_node_changes"]], "transit_changes (lasso.project attribute)": [[3, "lasso.Project.transit_changes"]], "write_project_card() (lasso.project method)": [[3, "lasso.Project.write_project_card"]], "standardtransit (class in lasso)": [[4, "lasso.StandardTransit"]], "__init__() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.__init__"]], "calculate_cube_mode() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.calculate_cube_mode"]], "cube_format() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.cube_format"]], "evaluate_differences() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.evaluate_differences"]], "feed (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.feed"]], "fromtransitnetwork() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.fromTransitNetwork"]], "parameters (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.parameters"]], "read_gtfs() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.read_gtfs"]], "route_properties_gtfs_to_cube() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.route_properties_gtfs_to_cube"]], "shape_gtfs_to_cube() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_cube"]], "shape_gtfs_to_dict_list() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_dict_list"]], "shape_gtfs_to_emme() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_emme"]], "time_to_cube_time_period() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.time_to_cube_time_period"]], "write_as_cube_lin() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.write_as_cube_lin"]], "lasso.logger": [[5, "module-lasso.logger"]], "module": [[5, "module-lasso.logger"], [6, "module-lasso.util"], [7, "module-lasso"]], "setuplogging() (in module lasso.logger)": [[5, "lasso.logger.setupLogging"]], "point (class in lasso.util)": [[6, "lasso.util.Point"]], "polygon (class in lasso.util)": [[6, "lasso.util.Polygon"]], "almost_equals() (lasso.util.point method)": [[6, "lasso.util.Point.almost_equals"]], "almost_equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.almost_equals"]], "area (lasso.util.point property)": [[6, "lasso.util.Point.area"]], "area (lasso.util.polygon property)": [[6, "lasso.util.Polygon.area"]], "args (lasso.util.partial attribute)": [[6, "lasso.util.partial.args"]], "boundary (lasso.util.point property)": [[6, "lasso.util.Point.boundary"]], "boundary (lasso.util.polygon property)": [[6, "lasso.util.Polygon.boundary"]], "bounds (lasso.util.point property)": [[6, "lasso.util.Point.bounds"]], "bounds (lasso.util.polygon property)": [[6, "lasso.util.Polygon.bounds"]], "buffer() (lasso.util.point method)": [[6, "lasso.util.Point.buffer"]], "buffer() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.buffer"]], "centroid (lasso.util.point property)": [[6, "lasso.util.Point.centroid"]], "centroid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.centroid"]], 
"column_name_to_parts() (in module lasso.util)": [[6, "lasso.util.column_name_to_parts"]], "contains() (lasso.util.point method)": [[6, "lasso.util.Point.contains"]], "contains() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains"]], "contains_properly() (lasso.util.point method)": [[6, "lasso.util.Point.contains_properly"]], "contains_properly() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains_properly"]], "convex_hull (lasso.util.point property)": [[6, "lasso.util.Point.convex_hull"]], "convex_hull (lasso.util.polygon property)": [[6, "lasso.util.Polygon.convex_hull"]], "coords (lasso.util.point property)": [[6, "lasso.util.Point.coords"]], "coords (lasso.util.polygon property)": [[6, "lasso.util.Polygon.coords"]], "covered_by() (lasso.util.point method)": [[6, "lasso.util.Point.covered_by"]], "covered_by() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covered_by"]], "covers() (lasso.util.point method)": [[6, "lasso.util.Point.covers"]], "covers() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covers"]], "create_locationreference() (in module lasso.util)": [[6, "lasso.util.create_locationreference"]], "crosses() (lasso.util.point method)": [[6, "lasso.util.Point.crosses"]], "crosses() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.crosses"]], "difference() (lasso.util.point method)": [[6, "lasso.util.Point.difference"]], "difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.difference"]], "disjoint() (lasso.util.point method)": [[6, "lasso.util.Point.disjoint"]], "disjoint() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.disjoint"]], "distance() (lasso.util.point method)": [[6, "lasso.util.Point.distance"]], "distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.distance"]], "dwithin() (lasso.util.point method)": [[6, "lasso.util.Point.dwithin"]], "dwithin() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.dwithin"]], "envelope (lasso.util.point property)": [[6, "lasso.util.Point.envelope"]], "envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.envelope"]], "equals() (lasso.util.point method)": [[6, "lasso.util.Point.equals"]], "equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals"]], "equals_exact() (lasso.util.point method)": [[6, "lasso.util.Point.equals_exact"]], "equals_exact() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals_exact"]], "exterior (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.exterior"]], "exterior (lasso.util.polygon property)": [[6, "id0"]], "from_bounds() (lasso.util.polygon class method)": [[6, "lasso.util.Polygon.from_bounds"]], "func (lasso.util.partial attribute)": [[6, "lasso.util.partial.func"]], "geodesic_point_buffer() (in module lasso.util)": [[6, "lasso.util.geodesic_point_buffer"]], "geom_type (lasso.util.point property)": [[6, "lasso.util.Point.geom_type"]], "geom_type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.geom_type"]], "geometrytype() (lasso.util.point method)": [[6, "lasso.util.Point.geometryType"]], "geometrytype() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.geometryType"]], "get_shared_streets_intersection_hash() (in module lasso.util)": [[6, "lasso.util.get_shared_streets_intersection_hash"]], "has_z (lasso.util.point property)": [[6, "lasso.util.Point.has_z"]], "has_z (lasso.util.polygon property)": [[6, "lasso.util.Polygon.has_z"]], "hausdorff_distance() (lasso.util.point method)": [[6, "lasso.util.Point.hausdorff_distance"]], "hausdorff_distance() (lasso.util.polygon 
method)": [[6, "lasso.util.Polygon.hausdorff_distance"]], "hhmmss_to_datetime() (in module lasso.util)": [[6, "lasso.util.hhmmss_to_datetime"]], "interiors (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.interiors"]], "interiors (lasso.util.polygon property)": [[6, "id1"]], "interpolate() (lasso.util.point method)": [[6, "lasso.util.Point.interpolate"]], "interpolate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.interpolate"]], "intersection() (lasso.util.point method)": [[6, "lasso.util.Point.intersection"]], "intersection() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersection"]], "intersects() (lasso.util.point method)": [[6, "lasso.util.Point.intersects"]], "intersects() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersects"]], "is_closed (lasso.util.point property)": [[6, "lasso.util.Point.is_closed"]], "is_closed (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_closed"]], "is_empty (lasso.util.point property)": [[6, "lasso.util.Point.is_empty"]], "is_empty (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_empty"]], "is_ring (lasso.util.point property)": [[6, "lasso.util.Point.is_ring"]], "is_ring (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_ring"]], "is_simple (lasso.util.point property)": [[6, "lasso.util.Point.is_simple"]], "is_simple (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_simple"]], "is_valid (lasso.util.point property)": [[6, "lasso.util.Point.is_valid"]], "is_valid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_valid"]], "keywords (lasso.util.partial attribute)": [[6, "lasso.util.partial.keywords"]], "lasso.util": [[6, "module-lasso.util"]], "length (lasso.util.point property)": [[6, "lasso.util.Point.length"]], "length (lasso.util.polygon property)": [[6, "lasso.util.Polygon.length"]], "line_interpolate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_interpolate_point"]], "line_interpolate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_interpolate_point"]], "line_locate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_locate_point"]], "line_locate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_locate_point"]], "minimum_clearance (lasso.util.point property)": [[6, "lasso.util.Point.minimum_clearance"]], "minimum_clearance (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_clearance"]], "minimum_rotated_rectangle (lasso.util.point property)": [[6, "lasso.util.Point.minimum_rotated_rectangle"]], "minimum_rotated_rectangle (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_rotated_rectangle"]], "normalize() (lasso.util.point method)": [[6, "lasso.util.Point.normalize"]], "normalize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.normalize"]], "oriented_envelope (lasso.util.point property)": [[6, "lasso.util.Point.oriented_envelope"]], "oriented_envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.oriented_envelope"]], "overlaps() (lasso.util.point method)": [[6, "lasso.util.Point.overlaps"]], "overlaps() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.overlaps"]], "partial (class in lasso.util)": [[6, "lasso.util.partial"]], "point_on_surface() (lasso.util.point method)": [[6, "lasso.util.Point.point_on_surface"]], "point_on_surface() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.point_on_surface"]], "project() (lasso.util.point method)": [[6, "lasso.util.Point.project"]], "project() (lasso.util.polygon method)": [[6, 
"lasso.util.Polygon.project"]], "relate() (lasso.util.point method)": [[6, "lasso.util.Point.relate"]], "relate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate"]], "relate_pattern() (lasso.util.point method)": [[6, "lasso.util.Point.relate_pattern"]], "relate_pattern() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate_pattern"]], "representative_point() (lasso.util.point method)": [[6, "lasso.util.Point.representative_point"]], "representative_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.representative_point"]], "reverse() (lasso.util.point method)": [[6, "lasso.util.Point.reverse"]], "reverse() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.reverse"]], "secs_to_datetime() (in module lasso.util)": [[6, "lasso.util.secs_to_datetime"]], "segmentize() (lasso.util.point method)": [[6, "lasso.util.Point.segmentize"]], "segmentize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.segmentize"]], "shorten_name() (in module lasso.util)": [[6, "lasso.util.shorten_name"]], "simplify() (lasso.util.point method)": [[6, "lasso.util.Point.simplify"]], "simplify() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.simplify"]], "svg() (lasso.util.point method)": [[6, "lasso.util.Point.svg"]], "svg() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.svg"]], "symmetric_difference() (lasso.util.point method)": [[6, "lasso.util.Point.symmetric_difference"]], "symmetric_difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.symmetric_difference"]], "touches() (lasso.util.point method)": [[6, "lasso.util.Point.touches"]], "touches() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.touches"]], "transform() (in module lasso.util)": [[6, "lasso.util.transform"]], "type (lasso.util.point property)": [[6, "lasso.util.Point.type"]], "type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.type"]], "unidecode() (in module lasso.util)": [[6, "lasso.util.unidecode"]], "union() (lasso.util.point method)": [[6, "lasso.util.Point.union"]], "union() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.union"]], "within() (lasso.util.point method)": [[6, "lasso.util.Point.within"]], "within() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.within"]], "wkb (lasso.util.point property)": [[6, "lasso.util.Point.wkb"]], "wkb (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb"]], "wkb_hex (lasso.util.point property)": [[6, "lasso.util.Point.wkb_hex"]], "wkb_hex (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb_hex"]], "wkt (lasso.util.point property)": [[6, "lasso.util.Point.wkt"]], "wkt (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkt"]], "x (lasso.util.point property)": [[6, "lasso.util.Point.x"]], "xy (lasso.util.point property)": [[6, "lasso.util.Point.xy"]], "xy (lasso.util.polygon property)": [[6, "lasso.util.Polygon.xy"]], "y (lasso.util.point property)": [[6, "lasso.util.Point.y"]], "z (lasso.util.point property)": [[6, "lasso.util.Point.z"]], "lasso": [[7, "module-lasso"]]}}) \ No newline at end of file diff --git a/branch/remove_assignable/setup/index.html b/branch/remove_assignable/setup/index.html new file mode 100644 index 0000000..224865c --- /dev/null +++ b/branch/remove_assignable/setup/index.html @@ -0,0 +1,133 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/remove_assignable/starting/index.html b/branch/remove_assignable/starting/index.html new file mode 100644 index 0000000..1741429 --- /dev/null +++ b/branch/remove_assignable/starting/index.html @@ -0,0 +1,434 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

The following example uses a conda environment (recommended) and the pip package manager to install Lasso from source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPI repository because it is included in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the develop branch of each repository on GitHub:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in editable mode.

+

If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e installs it in editable mode.

  2. +
  3. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on GitHub.

  4. +
  5. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @tag.

  6. +
  7. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same cloning and installation instructions shown above for Lasso.

  8. +
+

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc. with Python syntax highlighting turned on.

  • +
  • GitHub Desktop to locally update your clones

  • +
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of Open Street Map and Shared Streets. In Network Wrangler these are read in from three json files representing: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category (see the illustrative snippet after this list).

  • +
  • [transit network], which is based on a frequency-based implementation of the csv-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml.

  • +
+
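
To illustrate the nested-field idea noted in the roadway network bullet above, a link attribute can carry a default value plus overrides for an ad-hoc time-of-day span or user category. The snippet below is a hypothetical illustration only; the field names are illustrative and not the exact roadway network schema.

# hypothetical illustration only -- not the exact network_wrangler schema
# a link attribute such as "lanes" may hold a default value plus
# time-of-day and user-category overrides
lanes = {
    "default": 3,
    "timeofday": [
        {"time": ["6:00", "9:00"], "value": 2},
        {"time": ["6:00", "9:00"], "category": ["hov"], "value": 1},
    ],
}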

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds data fields to the roadway network schema that MetCouncil uses in their travel model including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios. Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object is a subclass of RoadwayNetwork and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object for holding a standard transit feed as a Partridge object, with methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. Has the capability to parse cube line file properties and shapes into python dictionaries and compare line files and represent changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use default parameters listed in this class.

  • +
+
+

RoadwayNetwork

+

Reads, writes, queries and manipulates roadway network data, which is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key ='shape_id',
+        
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+net.apply_roadway_feature_change(
+    net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork
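
A minimal sketch of reading the standard transit network, reusing the STPAUL_DIR example directory from the Scenario example below:

from network_wrangler import TransitNetwork

# read the standard (GTFS-like) transit feed from a directory
transit_net = TransitNetwork.read(STPAUL_DIR)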

+
+
+

ProjectCard
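
A minimal sketch of reading a single project card, following the same pattern as the Scenario example below; validate=False skips schema validation:

import os
from network_wrangler import ProjectCard

# read one project card from the example project_cards directory
card = ProjectCard.read(
    os.path.join(STPAUL_DIR, "project_cards", "4_simple_managed_lane.yml"),
    validate=False,
)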

+
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key ='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network.

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork class with additional understanding about how to translate and write the network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key ='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the project class and has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration  file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# network written with direction from the parameters given
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. +
  3. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.

  4. +
+
+

Project Cards from Transit LIN Files
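
A sketch of this workflow, mirroring the Project example above: compare a base and a build set of Cube transit line files, evaluate the differences, and write them out as a project card.

import os
from lasso import Project

# compare base and build Cube transit line files and write a project card
test_project = Project.create_project(
    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
)
test_project.evaluate_changes()
test_project.write_project_card(
    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
)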

+
+
+

Project Cards from Cube LOG Files
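
A hedged sketch of the log-file workflow; the roadway_log_file and base_roadway_dir keyword arguments and the file names below are assumptions, not the confirmed Project.create_project signature.

import os
from lasso import Project

# keyword names are assumptions -- check Project.create_project for the exact signature
test_project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),  # hypothetical file name
    base_roadway_dir=BASE_ROADWAY_DIR,  # hypothetical constant
)
test_project.evaluate_changes()
test_project.write_project_card(
    os.path.join(SCRATCH_DIR, "roadway_changes.yml")  # hypothetical file name
)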

+
+
+

Model Network Files for a Scenario
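
A sketch that stitches together the pieces shown earlier on this page: apply project cards on top of a base scenario, convert the resulting roadway network to a ModelRoadwayNetwork, and write out the model network files.

from network_wrangler import Scenario
from lasso import ModelRoadwayNetwork

# layer project cards on top of the base networks (see the Scenario example above)
my_scenario = Scenario.create_scenario(
    base_scenario=my_base_scenario,
    project_cards_list=project_cards_list,
)
my_scenario.apply_all_projects()

# translate the scenario roadway network to the MetCouncil model format and write it out
model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(my_scenario.road_net)
model_road_net.write_roadway_as_fixedwidth()  # some versions take an output directory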

+
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic lasso functionality, please refer to the following jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/.buildinfo b/branch/seperate_maz_and_taz/.buildinfo new file mode 100644 index 0000000..b0ac44f --- /dev/null +++ b/branch/seperate_maz_and_taz/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: a6ecde6fa215bda40c6c2c465fb089ee +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/branch/seperate_maz_and_taz/_generated/lasso.CubeTransit/index.html b/branch/seperate_maz_and_taz/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..1bdcd1f --- /dev/null +++ b/branch/seperate_maz_and_taz/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,571 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type:
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type:
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type:
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type:
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type:
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change. WARNING: Doesn’t take care of non-contiguous time periods!!!!

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and returns a list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters:
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns:
+

Line name with new time period.

+
+
+
+ +
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
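
For example, a minimal sketch using the same example file names as the Project documentation:

import os
from lasso import CubeTransit

# create a network from one line file, then add another source to the same instance
tn = CubeTransit.create_from_cube(os.path.join(CUBE_DIR, "transit.LIN"))
tn.add_cube(os.path.join(CUBE_DIR, "transit_route_shape_change"))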
+ +
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters:
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns:
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
+
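
For example, a sketch using the argument values from the parameter descriptions above:

# per the docstring example values, this builds a line name like "0_452-111_452_pk1"
name = CubeTransit.build_route_name(
    route_id="452-111", time_period="pk", agency_id=0, direction_id=1
)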
+ +
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. WARNING: Doesn’t take care of non-contiguous time periods!!!!

+
+
Parameters:
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters:
+

line – name of line that is being updated

+
+
Returns:
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters:
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns:
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters:
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period-specific variables like headway, and variables that have standard units such as headway, which is minutes in cube and seconds in standard format.

+
+
Parameters:
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns:
+

+
A list of dictionaries with values for "property": <standard style property name>, "set": <property value with correct units>

+
+
+

+
+
Return type:
+

list

+
+
+
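
An illustrative sketch of the conversion described above; the cube property name, the standard property name, and the converted value shown here are assumptions, not guaranteed output of this method.

# hypothetical input and output -- property names are illustrative only
cube_props = {"HEADWAY[1]": 10}  # headway in minutes, cube style
standard_props = CubeTransit.cube_properties_to_standard_properties(cube_props)
# expected shape of the result (values assumed):
# [{"property": "headway_secs", "set": 600}]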
+ +
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
  3. +
    For routes being added or updated, identify if the time periods

    have changed or if there are multiples, and make duplicate lines if so

    +
    +
    +
  4. +
  5. Create project card dictionaries for each change.

  6. +
+
+
Parameters:
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns:
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters:
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns:
+

+
a list of dictionary values suitable for writing to a project card

{ ‘property’: <property_name>, ‘set’: <set value>, ‘change’: <change from existing value>, ‘existing’: <existing value to check> }

+
+
+

+
+
Return type:
+

transit_change_list (list)

+
+
+
+ +
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and returns a list of changes suitable for a project card.

+
+
Parameters:
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns:
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and returns the numbers in them.

+
+
Parameters:
+

properties_list (list) – list of all properties.

+
+
Returns:
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters:
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns:
+

452-111
time_period (str): i.e. pk
direction_id (str): i.e. 1
agency_id (str): i.e. 0

+
+
Return type:
+

route_id (str)

+
+
+
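
For example, a sketch using the docstring's example line name:

# for the docstring example name, this yields route_id "452-111",
# time_period "pk", direction_id "1", and agency_id "0"
route_info = CubeTransit.unpack_route_name("0_452-111_452_pk1")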
+ +
+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_generated/lasso.ModelRoadwayNetwork/index.html b/branch/seperate_maz_and_taz/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..8837127 --- /dev/null +++ b/branch/seperate_maz_and_taz/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1573 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters:
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variables linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
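
A minimal usage sketch, mirroring the Starting Out page; instances are typically created with read() or from_RoadwayNetwork() rather than by calling the constructor directly.

from lasso import ModelRoadwayNetwork

# read a standard roadway network and return a ModelRoadwayNetwork
net = ModelRoadwayNetwork.read(
    link_filename=STPAUL_LINK_FILE,
    node_filename=STPAUL_NODE_FILE,
    shape_filename=STPAUL_SHAPE_FILE,
    fast=True,
    shape_foreign_key="shape_id",
)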
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is "strongly" connected. A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints:
+

list of length of two unique keys of nodes making up endpoints of segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode:
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is "strongly" connected. A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links:
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

+
rtype:
+

GeoDataFrame

+
+
+

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what MetCouncil's model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
rtype:
+

bool

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
rtype:
+

tuple

+
+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that that property exists in the network.

validate_selection(selection[, ...])

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. +#MC +join the network with count node data, via SHST API node match result

+
+
Parameters:
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+ +

Add data from links going to/from nodes to node.

+
+
Return type:
+

DataFrame

+
+
Parameters:
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns:
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

add the new roadway features defined in the project card. +new shapes are also added for the new roadway links.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters:
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True if overwriting existing variable. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters:
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters:
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters:
+
    +
  • link_idx – list +indices of all links to apply change to

  • +
  • properties – list of dictionaries +roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

  • Network Graph (osmnx flavored networkX DiGraph)

  • List of disconnected subgraphs described by the list of their member nodes (as described by their model_node_id)

+
+
+
+
+ +
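A minimal sketch, assuming net is an existing network instance; it unpacks the returned graph and the lists of disconnected member nodes:

    graph, disconnected_subgraph_nodes = net.assess_connectivity(
        mode="drive",
        ignore_end_nodes=True,
    )
    print("number of disconnected subgraphs:", len(disconnected_subgraph_nodes))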
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Return type:
+

tuple

+
+
Parameters:
+

selection_dictonary – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. +This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters:
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

RoadwayNetwork

+
+
+
+ +
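A minimal sketch, assuming parameters is a lasso Parameters instance; the highest TAZ number shown is illustrative and should match your own zone system:

    net.calculate_centroidconnect(
        parameters=parameters,
        network_variable="centroidconnect",
        highest_taz_number=3100,  # illustrative value; use the highest TAZ id in your network
        as_integer=True,
        overwrite=False,
    )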
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. +This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters:
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable. +#MC

Parameters:

  • county_network_variable (str) – Name of the variable where the county names are stored. Default to “county”.

  • network_variable (str) – Name of the variable that should be written to. Default to “mpo”.

  • as_integer (bool) – If true, will convert true/false to 1/0s.

  • mpo_counties (list) – List of county names that are within mpo region.

  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters:
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Creates ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters:
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters:
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str +roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Creates hov corridor placeholder for project to write out corridor changes

+

hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Return type:
+

RoadwayNetwork

+
+
Parameters:
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance

+
+ +
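A minimal sketch that returns a new network with managed lane links, access/egress connectors, and shapes broken out, leaving the original object untouched; unspecified attribute lists fall back to instance or class defaults, and the extra attribute shown is illustrative:

    ml_net = net.create_managed_lane_network(
        keep_additional_attributes_ml_and_gp=["AADT"],  # illustrative extra attribute to copy
        in_place=False,
    )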
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Creates placeholder for project to write out the managed variable

+

managed defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters:
+

df (pandas DataFrame) –

+
+
Returns:
+

dataframe with fixed width for each column. +dict: dictionary with columns names as keys, column width as values.

+
+
Return type:
+

pandas dataframe

+
+
+
+ +
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

Delete the roadway features defined in the project card. +Valid links and nodes defined in the project get deleted, +and shapes corresponding to the deleted links are also deleted.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters:
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDs by a scalar. +TODO #237: what if node ids are not a number?

+
+
Parameters:
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers

+
+ +
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

Note: links with walk access are not marked as having walk access. +Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters:
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type:
+

pandas series

+
+
+
+ +
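A minimal sketch, assuming the network carries a time-period-stratified property as described above; the property name, time window, and default are illustrative:

    pm_lanes = net.get_property_by_time_period_and_group(
        "lanes",                         # illustrative property name
        time_period=["15:00", "19:00"],  # PM period
        default_return=1,
    )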
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters:
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
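A minimal sketch of a connectivity check before writing out a network:

    if not net.is_network_connected(mode="drive"):
        print("warning: the drive network contains disconnected subgraphs")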
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters:
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variables in the links_df file. If nothing is specified, assumes whatever the default is.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters:
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
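A minimal sketch, assuming net.nodes_df and net.links_df hold the standard node and link GeoDataFrames; the keyword arguments simply restate the documented defaults, and the call is made through ModelRoadwayNetwork because the method is static:

    graph = ModelRoadwayNetwork.ox_graph(
        net.nodes_df,
        net.links_df,
        node_foreign_key="model_node_id",
        link_foreign_key=["A", "B"],
        unique_link_key="model_link_id",
    )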
+ +
+
Parameters:
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes network standard.

+
+
Parameters:
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
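A minimal sketch of reading a standard network from disk; the file paths are hypothetical, and fast=True skips schema validation to speed up the read:

    from lasso import ModelRoadwayNetwork, Parameters

    net = ModelRoadwayNetwork.read(
        link_filename="standard_network/link.json",      # hypothetical paths
        node_filename="standard_network/node.geojson",
        shape_filename="standard_network/shape.geojson",
        fast=True,
        parameters=Parameters(),  # or a dictionary of parameter settings
    )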
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reading lots of same type of file and concatenating them into a single DataFrame.

+
+
Parameters:
+

path (str) – File path to SHST match results.

+
+
Returns:
+

geopandas geodataframe

+
+
Return type:
+

geodataframe

+
+
+

##todo +not sure why we need, but should be in utilities not this class

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters:
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns:
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+
+
Return type:
+

GeoDataFrame

+
+
+

Turn the roadway network into a GeoDataFrame.

Parameters:

roadway_net – the roadway network to export

+

returns: shapes dataframe

+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what metcouncil’s model is expecting. +#MC

Parameters:

output_epsg (int) – epsg number of output network.

+
+
Returns:
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Return type:
+

GeoDataFrame

+
+
+
+
Example usage:
+
net.select_roadway_features(
    selection = [{
        # a match condition for the from node using osm,
        # shared streets, or model node number
        'from': {'osm_model_link_id': '1234'},
        # a match for the to-node..
        'to': {'shstid': '4321'},
        # a regex or match for facility condition
        # could be # of lanes, facility type, etc.
        'facility': {'name': 'Main St'},
    }, ...])

+
+
+
+
+
+
+
+
Parameters:
+
    +
  • selection – dictionary with keys for: +A - from node +B - to node +link - which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiple used to discourage shortest paths which +meander from original search returned from name or ref query. +If not set here, will default to value of sp_weight_factor in +RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+ +
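A minimal sketch, assuming the A/B/link selection form described above; the node ids and facility name are illustrative, and the exact selection schema may vary with the underlying network_wrangler version:

    selection = {
        "link": [{"name": ["Main St"]}],  # illustrative facility name
        "A": {"model_node_id": 12345},    # illustrative from-node
        "B": {"model_node_id": 67890},    # illustrative to-node
    }
    selected_link_idx = net.select_roadway_features(selection, search_mode="drive")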
+ +
+
Return type:
+

bool

+
+
Parameters:
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters:
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple with length of four

  • Boolean if shortest path found

  • nx Directed graph of graph links

  • route of shortest path nodes as List

  • links in shortest path selected from links_df

+
+ +
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of

+
+
Parameters:
+

properties_to_split

dict: dictionary of output variable prefix mapped to the source variable and what to stratify it by, e.g.:

{
    'lanes': {'v': 'lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'ML_lanes': {'v': 'ML_lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'use': {'v': 'use', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
}

+

+
+
+
+ +
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters:
+
    +
  • links_df – Links GeoDataFrame. Useful if want to update a portion of network links +(i.e. only centroid connectors). If not provided, will use entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns:
+

links_df with updated distance

+
+
+
+ +
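A minimal sketch that refreshes only missing distance values, preferring true shape length where a shape exists:

    net.update_distance(
        use_shapes=True,
        units="miles",
        network_variable="distance",
        overwrite=False,  # only fill missing distances, keep existing values
        inplace=True,
    )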
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that the +property exists in the network.

+
+
Return type:
+

bool

+
+
Parameters:
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn’t +a specified value in the project card for existing when a +change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluate whether the selection dictionary contains the +minimum required values.

+
+
Return type:
+

bool

+
+
Parameters:
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type:
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • path – the path where the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does: +1. write out link and node fixed width data files for cube. +2. write out header and width correspondence. +3. write out cube network building script with header and width specification.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns:
+

None

+
+
+
+ +
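A minimal sketch; the output directory and file names are hypothetical placeholders:

    net.write_roadway_as_fixedwidth(
        output_dir="model_network",                           # hypothetical output folder
        output_link_txt="links.txt",
        output_node_txt="nodes.txt",
        output_link_header_width_txt="links_header_width.txt",
        output_node_header_width_txt="nodes_header_width.txt",
        output_cube_network_script="make_network.s",          # hypothetical Cube script name
        drive_only=False,
    )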
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name to additional output link subset layers

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_generated/lasso.Parameters/index.html b/branch/seperate_maz_and_taz/_generated/lasso.Parameters/index.html new file mode 100644 index 0000000..9654be6 --- /dev/null +++ b/branch/seperate_maz_and_taz/_generated/lasso.Parameters/index.html @@ -0,0 +1,555 @@ + + + + + + + lasso.Parameters — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a parameters instance +with a keyword argument setting the attribute. Parameters that are +not explicitly set will use default parameters listed in this class. +.. highlight:: python

+
+
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00, "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class.Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +
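A minimal sketch of overriding a few of the defaults listed above at runtime; any attribute documented for this class can be passed as a keyword argument, and the values shown simply restate the documented defaults:

    from lasso import Parameters

    params = Parameters(
        output_epsg=102646,
        centroid_connect_lanes=1,
        highest_taz_number=3100,
    )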

Methods

+ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ + + + + + + + + + + + + + + +

maz_shape_file

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+
self.county_code_dict = {

“Anoka”: 1, +“Carver”: 2, +“Dakota”: 3, +“Hennepin”: 4, +“Ramsey”: 5, +“Scott”: 6, +“Washington”: 7, +“external”: 10, +“Chisago”: 11, +“Goodhue”: 12, +“Isanti”: 13, +“Le Sueur”: 14, +“McLeod”: 15, +“Pierce”: 16, +“Polk”: 17, +“Rice”: 18, +“Sherburne”: 19, +“Sibley”: 20, +“St. Croix”: 21, +“Wright”: 22,

+
+
+

}

+
+ +
+
+maz_shape_file
+

#MC +self.route_type_bus_mode_dict = {“Urb Loc”: 5, “Sub Loc”: 6, “Express”: 7}

+

self.route_type_mode_dict = {0: 8, 2: 9}

+

self.cube_time_periods = {“1”: “AM”, “2”: “MD”} +self.cube_time_periods_name = {“AM”: “pk”, “MD”: “op”}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link. +The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_generated/lasso.Project/index.html b/branch/seperate_maz_and_taz/_generated/lasso.Project/index.html new file mode 100644 index 0000000..db9cc2c --- /dev/null +++ b/branch/seperate_maz_and_taz/_generated/lasso.Project/index.html @@ -0,0 +1,521 @@ + + + + + + + lasso.Project — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type:
+

dict

+
+
+
+ +
+ +

pandas dataframe of CUBE roadway link changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type:
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

ProjectCard constructor.

+
+
Parameters:
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of ProjectCard

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

ProjectCard constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatability(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

emme_name_to_wrangler_name(...)

rename emme names to wrangler names using crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

determine the network build object is node or link

get_operation_from_network_build_command()

determine the network build object action type

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads an emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters:
+

limit_variables_to_existing_network (bool) – True if no ad-hoc variables. Default to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters:
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

A Project instance.

+
+
+
+ +
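A minimal sketch of building project cards from a Cube roadway log file against a base roadway network; the paths and project name are hypothetical:

    project = Project.create_project(
        roadway_log_file="build/roadway_changes.log",  # hypothetical Cube log file
        base_roadway_dir="base_standard_network",      # hypothetical base network folder
        project_name="example_roadway_project",
    )
    project.write_project_card("project_cards/example_roadway_project.yml")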
+
+static determine_roadway_network_changes_compatability(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

rename emme names to wrangler names using crosswalk file

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

determine the network build object is node or link

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

determine the network build object action type

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns:
+

A DataFrame representation of the log file.

+
+
+
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

networkbuildfilename (str or list[str]) – File path to emme nework build file or list of network build file paths.

+
+
Returns:
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters:
+

filename (str) – File path to output .yml

+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_generated/lasso.StandardTransit/index.html b/branch/seperate_maz_and_taz/_generated/lasso.StandardTransit/index.html new file mode 100644 index 0000000..07f2d1d --- /dev/null +++ b/branch/seperate_maz_and_taz/_generated/lasso.StandardTransit/index.html @@ -0,0 +1,453 @@ + + + + + + + lasso.StandardTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters:
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed:
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number using the following logic.

cube_format(row)

Creates a string representing the route in cube line file notation. #MC :param row: row of a DataFrame representing a cube-formatted trip, with the Attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR.

evaluate_differences(transit_changes)

Compare changes from the transit_changes dataframe with the standard transit network returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

RoadwayNetwork to ModelRoadwayNetwork

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes for the route in the appropriate cube format.

shape_gtfs_to_dict_list(trip_id, shape_id, ...)

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of stepping through the routed nodes and matching them with shape nodes.

shape_gtfs_to_emme(trip_row)

Creates transit segments for the trips in the appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number using the following logic. #MC For rail, uses the GTFS route_type variable: https://developers.google.com/transit/gtfs/reference

+
+
# route_type : cube_mode
route_type_to_cube_mode = {0: 8,  # Tram, Streetcar, Light rail
                           3: 0,  # Bus; further disaggregated for cube
                           2: 9}  # Rail

For buses, uses route id numbers and route name to find express and suburban buses as follows:

if not cube_mode:
    if 'express' in row['LONGNAME'].lower():
        cube_mode = 7  # Express
    elif int(row['route_id'].split("-")[0]) > 99:
        cube_mode = 6  # Suburban Local
    else:
        cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters:
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns:
+

cube mode number

+
+
+
+ +
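As a hedged illustration of the rules above (the row values below are invented for this example and are not MetCouncil data):

# Hypothetical input row; field names follow the docstring above
row = {"route_type": 3, "route_id": "134-75", "LONGNAME": "Express - Downtown"}
# route_type 3 maps to bus (0), then 'express' in LONGNAME assigns cube_mode = 7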
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation. #MC

row: row of a DataFrame representing a cube-formatted trip, with the attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns:
+

string representation of route in cube line file notation

+
+
+
+ +
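A hedged sketch of the kind of row cube_format expects; the values are placeholders and the call pattern on a StandardTransit instance is an assumption:

import pandas as pd

# Hypothetical cube-formatted trip row (field names taken from the docstring above)
trip = pd.Series({
    "trip_id": "t_1001", "shape_id": "s_1001",
    "NAME": "0901_pk", "LONGNAME": "Route 901", "tod": "pk",
    "HEADWAY": 15, "MODE": 5, "ONEWAY": "T", "OPERATOR": 3,
})
# line_str = standard_transit.cube_format(trip)  # assumed call on an existing instance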
+
+evaluate_differences(transit_changes)[source]
+

Compares changes from the transit_changes dataframe with the standard transit network and returns the project card changes in dictionary format

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

Creates a StandardTransit instance from a TransitNetwork object.

+
+
Parameters:
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters:
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepares gtfs for a cube lin file. #MC
Does the following operations:
1. Combines route, frequency, trip, and shape information
2. Converts time of day to time periods
3. Calculates cube route name from gtfs route name and properties
4. Assigns a cube-appropriate mode number
5. Assigns a cube-appropriate operator number

+
+
Returns:
+

+
DataFrame of trips with cube-appropriate values for:
    +
  • NAME

  • +
  • ONEWAY

  • +
  • OPERATOR

  • +
  • MODE

  • +
  • HEADWAY

  • +
+
+
+

+
+
Return type:
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes for the route in the appropriate cube format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list for a route in cube format.

+
+
+
+ +
+
+shape_gtfs_to_dict_list(trip_id, shape_id, add_nntime)[source]
+

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of stepping through the routed nodes and matching them with shape nodes.

+

TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when +the transit routing on the roadway network is first performed.

+

As such, I’m copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications.

+
+
Parameters:
+
    +
• trip_id – trip_id of the trip in question

• shape_id – shape_id of the trip in question

  • +
+
+
Returns:
+

trip_id, shape_id, shape_pt_sequence, shape_mode_node_id, is_stop, access, stop_sequence

+
+
Return type:
+

list of dict records with columns

+
+
+
+ +
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segments for the trips in the appropriate emme format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a dataframe representation of the transit segment for a trip in emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters:
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns:
+

+
this_tp_num: if as_str is False, returns the numeric time period

this_tp: if as_str is True, returns the Cube time period name abbreviation

+
+
+

+
+
Return type:
+

this_tp_num

+
+
+
+ +
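A hedged example of the conversion; the period boundaries and labels come from the Parameters instance, so the "AM" label in the comments is an assumption:

start_secs = 6 * 3600 + 30 * 60  # 6:30 AM as seconds from midnight
# standard_transit.time_to_cube_time_period(start_secs)               # e.g. "AM" (assumed label)
# standard_transit.time_to_cube_time_period(start_secs, as_str=False) # numeric time period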
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties. #MC

outpath: File location for the output cube line file.

+
+ +
+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_generated/lasso.logger/index.html b/branch/seperate_maz_and_taz/_generated/lasso.logger/index.html new file mode 100644 index 0000000..b6cabdb --- /dev/null +++ b/branch/seperate_maz_and_taz/_generated/lasso.logger/index.html @@ -0,0 +1,143 @@ + + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.logger

+

Functions

+ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The info log is terse, giving just the bare minimum of detail so the network composition will be clear later. The debug log is very noisy, for debugging.

+

Pass None to either filename to skip writing that log. Also spews everything to the console if logToConsole is True.

+
+ +
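A minimal usage sketch; the log file names are placeholders:

from lasso.logger import setupLogging

# Terse info log, verbose debug log, and echo everything to the console
setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)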
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_generated/lasso.util/index.html b/branch/seperate_maz_and_taz/_generated/lasso.util/index.html new file mode 100644 index 0000000..d088911 --- /dev/null +++ b/branch/seperate_maz_and_taz/_generated/lasso.util/index.html @@ -0,0 +1,1696 @@ + + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.util

+

Functions

+ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

creates circular buffer polygon for node

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from seconds from midnight

shorten_name(name)

+
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A geometry type that represents a single coordinate with +x,y and possibly z values.

+

A point is a zero-dimensional feature and has zero length and zero area.

+
+
Parameters:
+

args (float, or sequence of floats) –

The coordinates can either be passed as a single parameter, or as +individual float values using multiple parameters:

+
    +
  1. 1 parameter: a sequence or array-like of with 2 or 3 values.

  2. +
  3. 2 or 3 parameters (float): x, y, and possibly z.

  4. +
+

+
+
+
+
+x, y, z
+

Coordinate values

+
+
Type:
+

float

+
+
+
+ +

Examples

+

Constructing the Point using separate parameters for x and y:

+
>>> p = Point(1.0, -1.0)
+
+
+

Constructing the Point using a list of x, y coordinates:

+
>>> p = Point([1.0, -1.0])
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for "equals()" to be True for two geometries and "equals_exact()" to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that's a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A geometry type representing an area that is enclosed by a linear ring.

+

A polygon is a two-dimensional feature and has a non-zero area. It may +have one or more negative-space “holes” which are also bounded by linear +rings. If any rings cross each other, the feature is invalid and +operations on it may fail.

+
+
Parameters:
+
    +
  • shell (sequence) – A sequence of (x, y [,z]) numeric coordinate pairs or triples, or +an array-like with shape (N, 2) or (N, 3). +Also can be a sequence of Point objects.

  • +
  • holes (sequence) – A sequence of objects which satisfy the same requirements as the +shell parameters above

  • +
+
+
+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type:
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type:
+

sequence

+
+
+
+ +

Examples

+

Create a square polygon with no holes

+
>>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.))
+>>> polygon = Polygon(coords)
+>>> polygon.area
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for "equals()" to be True for two geometries and "equals_exact()" to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.

+
+ +
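A short doctest-style illustration of from_bounds (standard Shapely behaviour, shown here as a hedged example):

>>> Polygon.from_bounds(0, 0, 2, 1).bounds
(0.0, 0.0, 2.0, 1.0)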
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that's a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls

+
+ +
+ +
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

creates circular buffer polygon for node

+
+
Parameters:
+
    +
  • lat – node lat

  • +
  • lon – node lon

  • +
  • meters – buffer distance, radius of circle

  • +
+
+
Returns:
+

Polygon

+
+
+
+ +
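A hedged usage sketch; the coordinates are arbitrary and the exact polygon returned depends on the buffer implementation:

from lasso.util import geodesic_point_buffer

# roughly 100 m circular buffer around a node near St. Paul, MN
poly = geodesic_point_buffer(44.9521, -93.0966, 100)
print(poly.bounds)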
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

+
+
Expected in/out
+
-93.0965985, 44.952112199999995 osm_node_id = 954734870

69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters:
+

hhmmss_str – string of hh:mm:ss

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

dt

+
+
+
+ +
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime time object from seconds from midnight

+
+
Parameters:
+

secs – seconds from midnight

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

dt

+
+
+
+ +
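A hedged example of the two time helpers above, assuming both return datetime.time objects as their docstrings state:

from lasso.util import hhmmss_to_datetime, secs_to_datetime

t_from_str = hhmmss_to_datetime("06:30:00")
t_from_secs = secs_to_datetime(6 * 3600 + 30 * 60)
# both values are expected to represent 06:30:00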
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input +parameters may iterable types like lists or arrays or single values. +The output shall be of the same type. Scalars in, scalars out. +Lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):
    return tuple(filter(None, [x, y, z]))

g2 = transform(id_func, g1)

Using pyproj >= 2.1, this example will accurately project Shapely geometries:

import pyproj

wgs84 = pyproj.CRS('EPSG:4326')
utm = pyproj.CRS('EPSG:32618')

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate a Unicode object into an ASCII string

+
+
Return type:
+

str

+
+
+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_modules/functools/index.html b/branch/seperate_maz_and_taz/_modules/functools/index.html new file mode 100644 index 0000000..b3abe79 --- /dev/null +++ b/branch/seperate_maz_and_taz/_modules/functools/index.html @@ -0,0 +1,1083 @@ + + + + + + functools — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
[docs]class partial: + """New function with partial application of the given arguments + and keywords. + """ + + __slots__ = "func", "args", "keywords", "__dict__", "__weakref__" + + def __new__(cls, func, /, *args, **keywords): + if not callable(func): + raise TypeError("the first argument must be callable") + + if hasattr(func, "func"): + args = func.args + args + keywords = {**func.keywords, **keywords} + func = func.func + + self = super(partial, cls).__new__(cls) + + self.func = func + self.args = args + self.keywords = keywords + return self + + def __call__(self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(*self.args, *args, **keywords) + + @recursive_repr() + def __repr__(self): + qualname = type(self).__qualname__ + args = [repr(self.func)] + args.extend(repr(x) for x in self.args) + args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items()) + if type(self).__module__ == "functools": + return f"functools.{qualname}({', '.join(args)})" + return f"{qualname}({', '.join(args)})" + + def __reduce__(self): + return type(self), (self.func,), (self.func, self.args, + self.keywords or None, self.__dict__ or None) + + def __setstate__(self, state): + if not isinstance(state, tuple): + raise TypeError("argument to __setstate__ must be a tuple") + if len(state) != 4: + raise TypeError(f"expected 4 items in state, got {len(state)}") + func, args, kwds, namespace = state + if (not callable(func) or not isinstance(args, tuple) or + (kwds is not None and not isinstance(kwds, dict)) or + (namespace is not None and not isinstance(namespace, dict))): + raise TypeError("invalid partial state") + + args = tuple(args) # just in case it's a subclass + if kwds is None: + kwds = {} + elif type(kwds) is not dict: # XXX does it need to be *exactly* dict? + kwds = dict(kwds) + if namespace is None: + namespace = {} + + self.__dict__ = namespace + self.func = func + self.args = args + self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

diff --git a/branch/seperate_maz_and_taz/_modules/index.html b/branch/seperate_maz_and_taz/_modules/index.html
new file mode 100644
index 0000000..d50382b
--- /dev/null
+++ b/branch/seperate_maz_and_taz/_modules/index.html
@@ -0,0 +1,116 @@
Overview: module code — lasso documentation
diff --git a/branch/seperate_maz_and_taz/_modules/lasso/logger/index.html b/branch/seperate_maz_and_taz/_modules/lasso/logger/index.html
new file mode 100644
index 0000000..53dd18d
--- /dev/null
+++ b/branch/seperate_maz_and_taz/_modules/lasso/logger/index.html
@@ -0,0 +1,153 @@
lasso.logger — lasso documentation

Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
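A short, editorial usage sketch for the logger module above; the log file names are hypothetical and not part of the lasso source.

from lasso.logger import WranglerLogger, setupLogging

# Terse summary lines go to the info log, full detail to the debug log, and
# everything is echoed to the console because logToConsole is True.
setupLogging("network_info.log", "network_debug.log", logToConsole=True)
WranglerLogger.info("base network loaded")
WranglerLogger.debug("detailed diagnostics go here")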
+
diff --git a/branch/seperate_maz_and_taz/_modules/lasso/parameters/index.html b/branch/seperate_maz_and_taz/_modules/lasso/parameters/index.html
new file mode 100644
index 0000000..8e396ec
--- /dev/null
+++ b/branch/seperate_maz_and_taz/_modules/lasso/parameters/index.html
@@ -0,0 +1,1062 @@
lasso.parameters — lasso documentation

Source code for lasso.parameters

+import os
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):  # note: the default is the working directory at import time, not at call time
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise ValueError(msg)
+
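# Editorial usage sketch -- not part of the lasso source. get_base_dir() looks for a
# "metcouncil_data" folder in the starting directory and then in up to two parent
# directories; when running outside such a checkout, pass the location explicitly
# (the path below is hypothetical):
#
#     base_dir = get_base_dir(lasso_base_dir="/path/to/lasso_checkout")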
+# should be a dataclass
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + self.taz_net_max_ft = 6 + self.maz_net_max_ft = 7 + # Potentially make a named tuple + self.taz_node_join_tolerance = (100, "US survey foot") + self.max_length_centroid_connector_when_none_in_taz = 999999999999999999999999999999999999 + + #TODO make this relative + self.taz_shape_file = r"C:\Users\USLP095001\code\MTC\travel-model-two\maz_taz\shapefiles\tazs_TM2_v2_2.shp" + self.maz_shape_file = r"C:\Users\USLP095001\code\MTC\travel-model-two\maz_taz\shapefiles\mazs_TM2_v2_2.shp" + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. 
+ """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. 
Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + + self.emme_drive_filter_criteria = [ + "" + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + 
self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + self.fare_2015_to_2000_deflator = 180.20/258.27 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 'subregion' + #### + + self.output_variables = [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + #"roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = 
os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("epsg:2875") + self.output_proj4 = '+proj=lcc +lat_0=32.1666666666667 +lon_0=-116.25 +lat_1=33.8833333333333 +lat_2=32.7833333333333 +x_0=2000000.0001016 +y_0=500000.0001016 +ellps=GRS80 +towgs84=-0.991,1.9072,0.5129,-1.25033e-07,-4.6785e-08,-5.6529e-08,0 +units=us-ft +no_defs +type=crs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '2875.prj') + self.wkt_projection = 'PROJCS["NAD83(HARN) / California zone 6 (ftUS)",GEOGCS["NAD83(HARN)",DATUM["NAD83_High_Accuracy_Reference_Network",SPHEROID["GRS 1980",6378137,298.257222101],TOWGS84[-0.991,1.9072,0.5129,-1.25033E-07,-4.6785E-08,-5.6529E-08,0]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4152"]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["latitude_of_origin",32.1666666666667],PARAMETER["central_meridian",-116.25],PARAMETER["standard_parallel_1",33.8833333333333],PARAMETER["standard_parallel_2",32.7833333333333],PARAMETER["false_easting",6561666.667],PARAMETER["false_northing",1640416.667],UNIT["US survey foot",0.304800609601219],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","2875"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 4756 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + "truck_access", + #MTC + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + # paramters added for PNR simulation + self.pnr_node_location = os.path.join( + self.data_file_location, "lookups", "pnr_stations.csv" + ) + self.pnr_buffer = 20 + self.knr_buffer = 2.5 + self.walk_buffer = 0.75 
+ self.transfer_buffer = 1 + self.taz_list = os.path.join( + self.data_file_location, "lookups", "taz_lists.csv" + ) + self.sf_county = os.path.join( + self.data_file_location, "lookups", "SFcounty.shp" + ) + + self.__dict__.update(kwargs)
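A brief, editorial construction sketch for the Parameters class above; the paths are hypothetical and not part of the lasso source. Keyword arguments override the matching defaults, and anything else passed is attached by the final self.__dict__.update(kwargs).

from lasso.parameters import Parameters

# The chosen directory (or one of its two parents) must contain the
# "metcouncil_data" folder for get_base_dir() to succeed.
params = Parameters(
    lasso_base_dir="/path/to/lasso_checkout",
    scratch_location="/tmp/lasso_scratch",
)
print(params.output_link_csv)   # -> /tmp/lasso_scratch/links.csv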
diff --git a/branch/seperate_maz_and_taz/_modules/lasso/project/index.html b/branch/seperate_maz_and_taz/_modules/lasso/project/index.html
new file mode 100644
index 0000000..65ad8b5
--- /dev/null
+++ b/branch/seperate_maz_and_taz/_modules/lasso/project/index.html
@@ -0,0 +1,1505 @@
lasso.project — lasso documentation

Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_link_changes: dataframe of roadway changes read from a log file + roadway_node_changes: dataframe of roadway changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatibility( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
+ +
[docs] @staticmethod
    def read_logfile(logfilename: Union[str, List[str]]):
        """
        Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes.

        Args:
            logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths.

        Returns:
            Two DataFrames: the link changes and the node changes recorded in the log file.
        """
        if type(logfilename) == str:
            logfilename = [logfilename]

        link_df = pd.DataFrame()
        node_df = pd.DataFrame()

        for file in logfilename:
            WranglerLogger.info("Reading logfile: {}".format(file))
            with open(file) as f:
                _content = f.readlines()

            _node_lines = [
                x.strip().replace(";", ",") for x in _content if x.startswith("N")
            ]
            WranglerLogger.debug("node lines: {}".format(_node_lines))
            _link_lines = [
                x.strip().replace(";", ",") for x in _content if x.startswith("L")
            ]
            WranglerLogger.debug("link lines: {}".format(_link_lines))

            _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[1:]
            WranglerLogger.debug("Node Cols: {}".format(_nodecol))
            _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[1:]
            WranglerLogger.debug("Link Cols: {}".format(_linkcol))

            def split_log(x):
                return list(reader([x], delimiter=',', quotechar='"'))[0]

            _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]], columns=_nodecol)
            WranglerLogger.debug("Node DF: {}".format(_node_df))
            _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]], columns=_linkcol)
            WranglerLogger.debug("Link DF: {}".format(_link_df))

            node_df = pd.concat([node_df, _node_df])
            link_df = pd.concat([link_df, _link_df])

        # CUBE logfile headers for string fields come back as NAME[111] instead of NAME; shorten them
        link_df.columns = [c.split("[")[0] for c in link_df.columns]
        node_df.columns = [c.split("[")[0] for c in node_df.columns]

        if len(link_df) > 0:
            # create operation history
            action_history_df = (
                link_df.groupby(['A', 'B'])["OPERATION"]
                .agg(lambda x: x.tolist())
                .rename("operation_history")
                .reset_index()
            )
            action_history_df["operation_final"] = action_history_df.apply(
                lambda x: Project._final_op(x), axis=1
            )

            link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left")

        if len(node_df) > 0:
            action_history_df = (
                node_df.groupby('N')["OPERATION"]
                .agg(lambda x: x.tolist())
                .rename("operation_history")
                .reset_index()
            )
            action_history_df["operation_final"] = action_history_df.apply(
                lambda x: Project._final_op(x), axis=1
            )

            node_df = pd.merge(node_df, action_history_df, on='N', how="left")

        WranglerLogger.info(
            "Processed {} Node lines and {} Link lines".format(
                node_df.shape[0], link_df.shape[0]
            )
        )

        return link_df, node_df
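A minimal usage sketch for the log reader above; the file name and the package-level import are placeholders and assumptions, not part of the lasso source:

    from lasso import Project  # assumes Project is exported at the package level

    # parse one or more Cube log files into link and node change tables
    link_changes_df, node_changes_df = Project.read_logfile("example_build.log")

    # each table carries OBJECT, OPERATION, GROUP plus the logged attribute columns,
    # with 'operation_history' / 'operation_final' summarizing repeated edits per A-B pair or node
    print(link_changes_df["operation_final"].value_counts())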
+ +
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
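A sketch of how the Emme network build reader above might be called; the file path is a placeholder and the package-level import is an assumption:

    from lasso import Project

    link_cmds_df, node_cmds_df, transit_cmds_df = Project.read_network_build_file(
        "emme_network_build.json"  # placeholder path to an Emme network build export
    )
    # link and node rows are consolidated per element_id, with
    # 'object', 'operation', and 'operation_history' / 'operation_final' columns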
+ +
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'] if n not in emme_node_id_crosswalk_df['emme_node_id'] + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
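A sketch of mapping Emme element ids back to Wrangler ids with the helper above; the crosswalk path is a placeholder, assumed to contain the 'emme_node_id' and 'model_node_id' columns the code reads:

    link_cmds_df, node_cmds_df, transit_cmds_df = Project.emme_id_to_wrangler_id(
        link_cmds_df,
        node_cmds_df,
        transit_cmds_df,
        emme_node_id_crosswalk_file="emme_node_id_crosswalk.csv",  # placeholder
    )
    # links gain A/B model node ids, nodes gain model_node_id,
    # and transit stop/shape records are re-keyed to model node ids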
+ +
[docs]    def get_object_from_network_build_command(row):
        """
        Determine which network object type a network build command applies to.

        Args:
            row: a single command record (dict) from the network build file's command history.

        Returns:
            'L' for a link, 'N' for a node, or one of 'TRANSIT_LINE', 'TRANSIT_STOP',
            'TRANSIT_SHAPE' for transit elements.
        """

        if row.get('command') == 'create_link':
            return 'L'

        if row.get('command') == 'create_node':
            return 'N'

        if row.get('command') == 'delete_link':
            return 'L'

        if row.get('command') == 'set_attribute':
            if row.get('parameters').get('element_type') == 'LINK':
                return 'L'
            if row.get('parameters').get('element_type') == 'NODE':
                return 'N'
            if row.get('parameters').get('element_type') == 'TRANSIT_LINE':
                return 'TRANSIT_LINE'
            if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT':
                return 'TRANSIT_STOP'

        if row.get('command') == 'modify_transit_line':
            return 'TRANSIT_SHAPE'
+ +
[docs]    def get_operation_from_network_build_command(row):
        """
        Determine the operation type of a network build command.

        Args:
            row: a single command record (dict) from the network build file's command history.

        Returns:
            'A' for an addition, 'C' for a change, or 'D' for a deletion.
        """

        if row.get('command') == 'create_link':
            return 'A'

        if row.get('command') == 'create_node':
            return 'A'

        if row.get('command') == 'delete_link':
            return 'D'

        if row.get('command') == 'set_attribute':
            if row.get('parameters').get('element_type') == 'LINK':
                return 'C'
            if row.get('parameters').get('element_type') == 'NODE':
                return 'C'
            if row.get('parameters').get('element_type') == 'TRANSIT_LINE':
                return 'C'
            if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT':
                return 'C'

        if row.get('command') == 'modify_transit_line':
            return 'C'
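A worked example of the two command classifiers above, using a hypothetical minimal command dictionary shaped like the entries in an Emme command_history list:

    cmd = {
        "command": "set_attribute",
        "parameters": {
            "element_type": "LINK",
            "element_ids": ["123-456"],   # placeholder element id
            "attribute_name": "lanes",
            "value": 3,
        },
    }
    Project.get_object_from_network_build_command(cmd)     # -> 'L'
    Project.get_operation_from_network_build_command(cmd)  # -> 'C'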
+ +
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
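A sketch of the attribute-name crosswalk step above; the CSV path is a placeholder, assumed to contain the 'emme_name' and 'wrangler_name' columns the code expects:

    link_cmds_df, node_cmds_df = Project.emme_name_to_wrangler_name(
        link_cmds_df,
        node_cmds_df,
        "emme_name_crosswalk.csv",  # placeholder
    )
    # unrecognized Emme columns are dropped and the remaining ones are renamed to Wrangler names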
+ +
[docs] @staticmethod + def determine_roadway_network_changes_compatability( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
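A sketch of the compatibility check above; all inputs are assumed to already exist (a ModelRoadwayNetwork, the two change DataFrames, and a lasso Parameters instance). It raises a ValueError if a changed link or node is missing from the base network:

    Project.determine_roadway_network_changes_compatability(
        base_roadway_network,    # ModelRoadwayNetwork
        roadway_link_changes,    # link change DataFrame, e.g. from Project.read_logfile()
        roadway_node_changes,    # node change DataFrame
        parameters,              # lasso Parameters
    )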
+ +
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated, initiates + self.card_data to be an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if (not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
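A sketch of using the method above on an existing Project instance (here simply called `project`); evaluate_changes populates card_data, which downstream code writes out as a project card:

    project.evaluate_changes()
    project.card_data["project"]          # the project name
    len(project.card_data["changes"])     # combined transit + highway change dictionaries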
+ +
[docs]    def add_transit_changes(self):
        """
        Evaluates changes between the base and build transit networks and
        returns the list of transit change dictionaries used in self.card_data.
        """
        transit_change_list = []
        if self.build_cube_transit_network:
            transit_change_list = self.build_cube_transit_network.evaluate_differences(
                self.base_cube_transit_network
            )
        elif self.base_transit_network:
            transit_change_list = self.base_transit_network.evaluate_differences(
                self.transit_changes
            )
        return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
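A small worked example of how _final_op collapses an operation history into a single final operation; plain dicts are used here for illustration, while the pipeline applies it row-wise to the grouped history DataFrame:

    Project._final_op({"operation_history": ["A", "C", "D"]})  # -> "N": added then deleted, no net change
    Project._final_op({"operation_history": ["D", "A"]})       # -> "C": deleted then re-added, treated as a change
    Project._final_op({"operation_history": ["C", "C"]})       # -> "C": repeated edits to an existing element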
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
                WranglerLogger.warning(msg)
                #raise ValueError(msg)
            node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"]

            if add_link_dict:
                add_link_dict["nodes"] = _process_node_additions(node_add_df)
            else:
                add_link_dict = {"category": "Add New Roadway", "nodes": _process_node_additions(node_add_df)}

        else:
            pass

        # combine the deletion, addition, and change dictionaries into a single list
        highway_change_list = list(
            filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list)
        )

        return highway_change_list
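For orientation, a sketch of what the method above returns when called on an existing Project instance; the shapes below mirror the dictionaries built in the code:

    highway_changes = project.add_highway_changes()
    # a list mixing, as applicable:
    #   {"category": "Roadway Deletion", "links": {"model_link_id": [...]}}
    #   {"category": "Add New Roadway", "links": [...], "nodes": [...]}
    #   {"category": "Roadway Property Change", "facility": {...}, "properties": [...]}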
\ No newline at end of file
diff --git a/branch/seperate_maz_and_taz/_modules/lasso/roadway/index.html b/branch/seperate_maz_and_taz/_modules/lasso/roadway/index.html
new file mode 100644
index 0000000..be27108
--- /dev/null
+++ b/branch/seperate_maz_and_taz/_modules/lasso/roadway/index.html
@@ -0,0 +1,2036 @@
+ lasso.roadway — lasso documentation

Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
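A minimal read sketch for the class method above; the directory and file names are placeholders (they follow the link.json / node.geojson / shape.geojson pattern used elsewhere in this module), and the package-level import is an assumption:

    from lasso import ModelRoadwayNetwork

    net = ModelRoadwayNetwork.read(
        link_filename="base_roadway/link.json",
        node_filename="base_roadway/node.geojson",
        shape_filename="base_roadway/shape.geojson",
        fast=True,       # skip schema validation for speed
        parameters={},   # fall back to default Parameters
    )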
+ +
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
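A conversion sketch for the method above, assuming `std_net` is an existing network_wrangler RoadwayNetwork instance:

    model_net = ModelRoadwayNetwork.from_RoadwayNetwork(
        roadway_network_object=std_net,
        parameters={},   # default lasso Parameters
    )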
+ +
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None): + """ + Splits properties by time period, assuming a variable structure of + + Args: + properties_to_split: dict + dictionary of output variable prefix mapped to the source variable and what to stratify it by + e.g. + { + 'lanes' : {'v':'lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'ML_lanes' : {'v':'ML_lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'use' : {'v':'use', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + } + + """ + import itertools + + if properties_to_split == None: + properties_to_split = self.parameters.properties_to_split + + for out_var, params in properties_to_split.items(): + if params["v"] not in self.links_df.columns: + WranglerLogger.warning( + "Specified variable to split: {} not in network variables: {}. Returning 0.".format( + params["v"], str(self.links_df.columns) + ) + ) + if params.get("time_periods") and params.get("categories"): + + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + time_suffix + "_" + category_suffix + ] = 0 + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[out_var + "_" + time_suffix] = 0 + elif params.get("time_periods") and params.get("categories"): + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + category_suffix + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=params["categories"][category_suffix], + time_period=params["time_periods"][time_suffix], + ) + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[ + out_var + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=None, + time_period=params["time_periods"][time_suffix], + ) + else: + raise ValueError( + "Shoudn't have a category without a time period: {}".format(params) + )
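A sketch of calling the splitter above with an explicit properties_to_split dictionary; the variable name and time-period windows are placeholders, and the keys follow what the method actually reads ('v', 'time_periods', and optionally 'categories'):

    net.split_properties_by_time_period_and_category(
        {
            "trn_priority": {
                "v": "trn_priority",
                "time_periods": {"AM": ("6:00", "9:00"), "PM": ("16:00", "19:00")},
            }
        }
    )
    # adds trn_priority_AM and trn_priority_PM columns to net.links_df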
+ +
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
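A sketch of attaching an external attribute via a SHST match result with the method above; the CSV path and column name are placeholders, and the file is assumed to carry the 'shstReferenceId' column the code requires:

    net.add_variable_using_shst_reference(
        var_shst_csvdata="aadt_shst_match.csv",   # placeholder SHST API match result
        shst_csv_variable="AADT",                 # column in that CSV
        network_variable="AADT",
        network_var_type=int,
        overwrite=True,
    )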
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
[docs] @staticmethod + def read_match_result(path): + """ + Reads the shst geojson match returns. + + Returns shst dataframe. + + Reads all files of the same type that match the path pattern and concatenates them into a single DataFrame. + + Args: + path (str): File path (or glob pattern) to SHST match results. + + Returns: + geodataframe: geopandas geodataframe + + ##todo + not sure why we need this, but it should be in utilities, not this class + """ + refId_gdf = DataFrame() + refid_file = glob.glob(path) + for i in refid_file: + new = gpd.read_file(i) + refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False) + return refId_gdf
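An equivalent standalone sketch of the glob-and-concatenate pattern (the shst_match/*.matched.geojson pattern is illustrative)::

    import glob
    import geopandas as gpd
    import pandas as pd

    matched_files = glob.glob("shst_match/*.matched.geojson")
    gdfs = [gpd.read_file(f) for f in matched_files]
    ref_id_gdf = pd.concat(gdfs, ignore_index=True, sort=False) if gdfs else pd.DataFrame()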
+ +
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # targeted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that the new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (dataframe): Source data SHST match result. + source_gdf (dataframe): Source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + dataframe: links_df columns plus the joined attribute and source_link_id. + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_link_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
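A small sketch of the two-stage merge and best-score de-duplication, using hypothetical toy frames::

    import pandas as pd

    links_df = pd.DataFrame({"model_link_id": [1, 2], "shstReferenceId": ["a1", "b2"]})
    shst_match_df = pd.DataFrame(
        {"shstReferenceId": ["a1", "a1", "b2"], "pp_link_id": [10, 11, 12], "score": [0.4, 0.9, 0.7]}
    )
    source_gdf = pd.DataFrame({"LINK_ID": [10, 11, 12], "route_sys": ["A", "B", "C"]})

    joined = links_df.merge(
        shst_match_df.rename(columns={"pp_link_id": "source_link_id", "score": "source_score"}),
        how="left", on="shstReferenceId",
    ).merge(source_gdf.rename(columns={"LINK_ID": "source_link_id"}), how="left", on="source_link_id")

    # keep the best-scoring source record per network link
    joined = joined.sort_values(["model_link_id", "source_score"], na_position="first")
    joined = joined.drop_duplicates(subset=["model_link_id"], keep="last")
    # model_link_id 1 gets route_sys "B" (score 0.9); model_link_id 2 gets "C"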
+ +
[docs] def calculate_use( + self, + network_variable="use", + as_integer=True, + overwrite=False, + ): + """ + Calculates use variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "use" + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing use variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "'use' Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + #MTC + self.links_df[network_variable] = int(1) + #/MTC + #MC + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + #/MC + + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs] def create_ML_variable( + self, + network_variable="ML_lanes", + overwrite=False, + ): + """ + Creates ML lanes placeholder for projects to write out ML changes + + ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing ML Variable '{}' already in network".format( + network_variable + ) + ) + self.links_df[network_variable] = int(0) + else: + WranglerLogger.info( + "ML Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + WranglerLogger.info( + "Finished creating ML lanes variable: {}".format(network_variable) + )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Creates hov corridor placeholder for projects to write out corridor changes + + hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Creates placeholder for projects to write out the managed flag + + managed defaults to 0; its info comes from the cube LOG file and is stored in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing centroid connector variable in network. Default to False. + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_taz_number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
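The core rule is that any link touching a node numbered at or below the highest TAZ number is flagged as a centroid connector; a toy illustration (the threshold of 3100 is hypothetical)::

    import pandas as pd

    links_df = pd.DataFrame({"A": [12, 4500, 3001], "B": [8000, 9000, 9500]})
    highest_taz_number = 3100

    links_df["centroidconnect"] = (
        (links_df["A"] <= highest_taz_number) | (links_df["B"] <= highest_taz_number)
    ).astype(int)
    # -> [1, 0, 1]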
+ + +
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
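A standalone sketch of the projection-then-length calculation used above (coordinates are made up; EPSG:26915 is UTM zone 15N, so .length returns meters)::

    import geopandas as gpd
    from shapely.geometry import LineString

    gdf = gpd.GeoDataFrame(
        geometry=[LineString([(-93.26, 44.97), (-93.25, 44.97)])], crs="EPSG:4326"
    )
    gdf = gdf.to_crs(epsg=26915)
    gdf["distance"] = gdf.geometry.length / 1609.34  # meters -> miles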
+ +
[docs] def convert_int(self, int_col_names=[]): + """ + Convert integer columns + """ + + #MTC + WranglerLogger.info( + "Converting variable type to mtc standard" + ) + + int_col_names = self.parameters.int_col + #/MTC + #MC + """ + WranglerLogger.info("Converting variable type to MetCouncil standard") + + if not int_col_names: + int_col_names = self.parameters.int_col + #/MC + """ + ##Why are we doing this? + # int_col_names.remove("lanes") + + for c in list(set(self.links_df.columns) & set(int_col_names)): + try: + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + self.links_df[c] = self.links_df[c].replace("", 0) + self.links_df[c] = self.links_df[c].astype(int) + except ValueError: + try: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + except: + msg = f"Could not convert column {c} to integer." + WranglerLogger.error(msg) + raise ValueError(msg) + + for c in list(set(self.nodes_df.columns) & set(int_col_names)): + self.nodes_df[c] = self.nodes_df[c].replace("", 0) + self.nodes_df[c] = self.nodes_df[c].astype(int)
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
+ +
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "heuristic_num" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2" + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
+ + + # this should be moved to util +
[docs] @staticmethod + def dataframe_to_fixed_width(df): + """ + Convert dataframe to fixed width format, geometry column will not be transformed. + + Args: + df (pandas DataFrame). + + Returns: + pandas dataframe: dataframe with fixed width for each column. + dict: dictionary with columns names as keys, column width as values. + """ + WranglerLogger.info("Starting fixed width conversion") + + # get the max length for each variable column + max_width_dict = dict( + [ + (v, df[v].apply(lambda r: len(str(r)) if r != None else 0).max()) + for v in df.columns.values + if v != "geometry" + ] + ) + + fw_df = df.drop("geometry", axis=1).copy() + for c in fw_df.columns: + fw_df[c] = fw_df[c].apply(lambda x: str(x)) + fw_df["pad"] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x))) + fw_df[c] = fw_df.apply(lambda x: x["pad"] + x[c], axis=1) + + return fw_df, max_width_dict
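For example (assuming ModelRoadwayNetwork, the enclosing class referenced above, is importable from the lasso package), a two-column frame with a throwaway geometry column::

    import geopandas as gpd
    from shapely.geometry import Point
    from lasso import ModelRoadwayNetwork

    df = gpd.GeoDataFrame(
        {"A": [1, 100], "name": ["I-94", "CSAH 5"]},
        geometry=[Point(0, 0), Point(1, 1)],
    )
    fw_df, widths = ModelRoadwayNetwork.dataframe_to_fixed_width(df)
    # widths -> {"A": 3, "name": 6}; each value is left-padded with spaces to that width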
+ +
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
+
+ +
+
+
+ +
+ +
+


+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_modules/lasso/transit/index.html b/branch/seperate_maz_and_taz/_modules/lasso/transit/index.html new file mode 100644 index 0000000..e075d49 --- /dev/null +++ b/branch/seperate_maz_and_taz/_modules/lasso/transit/index.html @@ -0,0 +1,2037 @@ + + + + + + lasso.transit — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict: Dict[str, Any] = {}
+ +
[docs] def add_cube(self, transit_source: str): + """Reads a .lin file and adds it to existing TransitNetwork instance. + + Args: + transit_source: a cube line file string, a file path, or a directory of .LIN files to be parsed + + """ + + """ + Figure out what kind of transit source it is + """ + + parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr") + + if "NAME=" in transit_source: + WranglerLogger.debug("reading transit source as string") + self.source_list.append("input_str") + parse_tree = parser.parse(transit_source) + elif os.path.isfile(transit_source): + print("reading: {}".format(transit_source)) + with open(transit_source) as file: + WranglerLogger.debug( + "reading transit source: {}".format(transit_source) + ) + self.source_list.append(transit_source) + parse_tree = parser.parse(file.read()) + elif os.path.isdir(transit_source): + import glob + + for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")): + self.add_cube(lin_file) + return + else: + msg = "{} not a valid transit line string, directory, or file".format(transit_source) + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("finished parsing cube line file") + # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty())) + transformed_tree_data = CubeTransformer().transform(parse_tree) + # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data)) + + _line_data = transformed_tree_data["lines"] + + line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()} + line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()} + new_lines = list(line_properties_dict.keys()) + """ + Before adding lines, check to see if any are overlapping with existing ones in the network + """ + + overlapping_lines = set(new_lines) & set(self.lines) + if overlapping_lines: + msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format( + transit_source, + "\n - ".join(self.source_list), + len(overlapping_lines), + len(new_lines), + overlapping_lines, + ) + print(msg) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.program_type = transformed_tree_data.get("program_type", None) + + self.lines += new_lines + self.line_properties.update(line_properties_dict) + self.shapes.update(line_shapes_dict) + + WranglerLogger.debug("Added lines to CubeTransit: \n{}".format(new_lines))
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
+ +
[docs] def add_additional_time_periods( + self, new_time_period_number: int, orig_line_name: str + ): + """ + Copies a route to another cube time period with appropriate + values for time-period-specific properties. + + New properties are stored under the new name in: + - ::self.shapes + - ::self.line_properties + + Args: + new_time_period_number (int): cube time period number + orig_line_name (str): name of the originating line, from which + the new line will copy its properties. + + Returns: + Line name with new time period. + """ + WranglerLogger.debug( + "adding time periods {} to line {}".format( + new_time_period_number, orig_line_name + ) + ) + + ( + route_id, + _init_time_period, + agency_id, + direction_id, + ) = CubeTransit.unpack_route_name(orig_line_name) + new_time_period_name = self.parameters.cube_time_periods[new_time_period_number] + new_tp_line_name = CubeTransit.build_route_name( + route_id=route_id, + time_period=new_time_period_name, + agency_id=agency_id, + direction_id=direction_id, + ) + + try: + assert new_tp_line_name not in self.lines + except: + msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + # copy to a new line and add it to list of lines to add + self.line_properties[new_tp_line_name] = copy.deepcopy( + self.line_properties[orig_line_name] + ) + self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name]) + self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name + + """ + Remove entries that aren't for this time period from the new line's properties list. + """ + this_time_period_properties_list = [ + p + "[" + str(new_time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + self.line_properties[new_tp_line_name].pop(k, None) + + """ + Remove entries for this time period from the original line's properties list. + """ + for k in this_time_period_properties_list: + self.line_properties[orig_line_name].pop(k, None) + + """ + Add new line to list of lines to add. + """ + WranglerLogger.debug( + "Adding new time period {} for line {} as {}.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + ) + return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs] def create_add_route_card_dict(self, line: str): + """ + Creates a project card change formatted dictionary for adding + a route based on the information in self.route_properties for + the line. + + Args: + line: name of line that is being updated + + Returns: + A project card change-formatted dictionary for the route addition. + """ + start_time_str, end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + standard_properties = self.cube_properties_to_standard_properties( + self.line_properties[line] + ) + + routing_properties = { + "property": "routing", + "set": self.shapes[line]["node"].tolist(), + } + + add_card_dict = { + "category": "New Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('_')[-2]), + "start_time": start_time_str, + "end_time": end_time_str, + "agency_id": line.strip('_')[0], + }, + "properties": standard_properties + [routing_properties], + } + + WranglerLogger.debug( + "Adding {} route to changes:\n{}".format(line, add_card_dict) + ) + return add_card_dict
+ +
[docs] @staticmethod + def get_time_period_numbers_from_cube_properties(properties_list: list): + """ + Finds properties that are associated with time periods and the + returns the numbers in them. + + Args: + properties_list (list): list of all properties. + + Returns: + list of strings of the time period numbers found + """ + time_periods_list = [] + for p in properties_list: + if ("[" not in p) or ("]" not in p): + continue + tp_num = p.split("[")[1][0] + if tp_num and tp_num not in time_periods_list: + time_periods_list.append(tp_num) + return time_periods_list
+ +
[docs] @staticmethod + def build_route_name( + route_id: str = "", + time_period: str = "", + agency_id: str = 0, + direction_id: str = 1, + ): + """ + Create a route name by contatenating route, time period, agency, and direction + + Args: + route_id: i.e. 452-111 + time_period: i.e. pk + direction_id: i.e. 1 + agency_id: i.e. 0 + + Returns: + constructed line_name i.e. "0_452-111_452_pk1" + """ + + return ( + str(agency_id) + + "_" + + str(route_id) + + "_" + + str(route_id.split("-")[0]) + + "_" + + str(time_period) + + str(direction_id) + )
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
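These two helpers are inverses of each other; for example (assuming CubeTransit is imported from lasso)::

    from lasso import CubeTransit

    name = CubeTransit.build_route_name(
        route_id="452-111", time_period="pk", agency_id="0", direction_id="1"
    )
    # name -> "0_452-111_452_pk1"
    route_id, time_period, agency_id, direction_id = CubeTransit.unpack_route_name(name)
    # -> ("452-111", "pk", "0", "1")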
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + #WranglerLogger.error(msg) + #raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
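In effect the method converts each period's "HH:MM" bounds to minutes since midnight and keeps the overall earliest start and latest end; a sketch with made-up period bounds::

    periods = [("06:00", "09:00"), ("09:00", "15:30")]  # e.g. AM peak plus midday

    def to_minutes(hhmm):
        hours, minutes = hhmm.split(":")
        return int(hours) * 60 + int(minutes)

    start_m = min(to_minutes(start) for start, _ in periods)
    end_m = max(to_minutes(end) for _, end in periods)
    start_str = "{:02d}:{:02d}".format(*divmod(start_m, 60))
    end_str = "{:02d}:{:02d}".format(*divmod(end_m, 60))
    # -> ("06:00", "15:30")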
+ +
[docs] @staticmethod + def cube_properties_to_standard_properties(cube_properties_dict: dict): + """ + Converts cube style properties to standard properties. + + This is most pertinent to time-period specific variables like headway, + and varibles that have stnadard units like headway, which is minutes + in cube and seconds in standard format. + + Args: + cube_properties_dict: <cube style property name> : <property value> + + Returns: + A list of dictionaries with values for `"property": <standard + style property name>, "set" : <property value with correct units>` + + """ + standard_properties_list = [] + for k, v in cube_properties_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + change_item["set"] = v * 60 + else: + change_item["property"] = k + change_item["set"] = v + standard_properties_list.append(change_item) + + return standard_properties_list
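For example, a cube headway of 10 minutes in time period 1 becomes a standard headway_secs of 600, while other properties pass through unchanged::

    from lasso import CubeTransit

    CubeTransit.cube_properties_to_standard_properties({"HEADWAY[1]": 10, "MODE": 5})
    # -> [{"property": "headway_secs", "set": 600}, {"property": "MODE", "set": 5}]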
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs returns list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: dDataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
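A toy example of the shape comparison on node series: the changed node is returned together with neighboring unchanged nodes so the routing change can be located on the network::

    import pandas as pd
    from lasso import CubeTransit

    base_shape = pd.Series([1, 2, 3, 4, 5, 6, 7])
    build_shape = pd.Series([1, 2, 3, 40, 5, 6, 7])
    CubeTransit.evaluate_route_shape_changes(build_shape, base_shape)
    # -> [{"property": "routing", "existing": [2, 3, 4, 5], "set": [2, 3, 40, 5]}]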
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + Create a StandardTransit instance from a TransitNetwork instance. + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
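A small worked example of the mode assignment above (row values are illustrative; `standard_transit` is an assumed StandardTransit instance):

    row = {"route_type": 3, "route_long_name": "Express to Downtown", "route_id": "94-12"}
    standard_transit.calculate_cube_mode(row)   # -> 7 (express bus)

    row = {"route_type": 3, "route_long_name": "Local service", "route_id": "21-3"}
    standard_transit.calculate_cube_mode(row)   # -> 5 (urban local, since 21 <= 99)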
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
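A usage sketch (not part of the source). The period returned depends entirely on `parameters.time_period_to_time`; the values shown assume a configuration in which 06:30 falls in an "AM" period:

    start_secs = 6 * 3600 + 30 * 60                                       # 06:30:00
    standard_transit.time_to_cube_time_period(start_secs)                 # -> e.g. "AM"
    standard_transit.time_to_cube_time_period(start_secs, as_str=False)   # -> matching period number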
+ +
[docs] def shape_gtfs_to_dict_list(self, trip_id: str, shape_id: str, add_nntime: bool): + """ + This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of + stepping through the routed nodes and corresponding them with shape nodes. + + TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when + the transit routing on the roadway network is first performed. + + As such, I'm copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications. + + Args: + trip_id of the trip in question + shape_id of the trip in question + Returns: + list of dict records with columns: + trip_id + shape_id + shape_pt_sequence + shape_mode_node_id + is_stop + access + stop_sequence + """ + # get the stop times for this route + # https://developers.google.com/transit/gtfs/reference#stop_timestxt + trip_stop_times_df = self.feed.stop_times.loc[ self.feed.stop_times.trip_id == trip_id, + ['trip_id','arrival_time','departure_time','stop_id','stop_sequence','pickup_type','drop_off_type']].copy() + trip_stop_times_df.sort_values(by='stop_sequence', inplace=True) + trip_stop_times_df.reset_index(drop=True, inplace=True) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df: + # trip_id arrival_time departure_time stop_id stop_sequence pickup_type drop_off_type + # 0 10007 0 0 7781 1 0 NaN + # 1 10007 120 120 7845 2 0 NaN + # 2 10007 300 300 7790 3 0 NaN + # 3 10007 360 360 7854 4 0 NaN + # 4 10007 390 390 7951 5 0 NaN + # 5 10007 720 720 7950 6 0 NaN + # 6 10007 810 810 7850 7 0 NaN + # 7 10007 855 855 7945 8 0 NaN + # 8 10007 900 900 7803 9 0 NaN + # 9 10007 930 930 7941 10 0 NaN + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + + # get the shapes for this route + # https://developers.google.com/transit/gtfs/reference#shapestxt + trip_node_df = self.feed.shapes.loc[self.feed.shapes.shape_id == shape_id].copy() + trip_node_df.sort_values(by="shape_pt_sequence", inplace = True) + trip_node_df.reset_index(drop=True, inplace=True) + # print("trip_node_df.head(20):\n{}".format(trip_node_df.head(20))) + # print("trip_node_df.dtypes:\n{}".format(trip_node_df.dtypes)) + # trip_node_df: + # shape_id shape_pt_sequence shape_osm_node_id shape_shst_node_id shape_model_node_id shape_pt_lat shape_pt_lon + # 0 696 1 1429334016 35cb440c505534e8aedbd3a286b70eab 2139625 NaN NaN + # 1 696 2 444242480 39e263722d5849b3c732b48734671400 2164862 NaN NaN + # 2 696 3 5686705779 4c41c608c35f457079fd673bce5556e5 2169898 NaN NaN + # 3 696 4 3695761874 d0f5b2173189bbb1b5dbaa78a004e8c4 2021876 NaN NaN + # 4 696 5 1433982749 60726971f0fb359a57e9d8df30bf384b 2002078 NaN NaN + # 5 696 6 1433982740 634c301424647d5883191edf522180e3 2156807 NaN NaN + # 6 696 7 4915736746 f03c3d7f1aa0358a91c165f53dac1e20 2145185 NaN NaN + # 7 696 8 65604864 68b8df24f1572d267ecf834107741393 2120788 NaN NaN + # 8 696 9 65604866 e412a013ad45af6649fa1b396f74c127 2066513 NaN NaN + # 9 696 10 956664242 657e1602aa8585383ed058f28f7811ed 2006476 NaN NaN + # 10 696 11 291642561 726b03cced023a6459d7333885927208 2133933 NaN NaN + # 11 696 12 291642583 709a0c00811f213f7476349a2c002003 2159991 NaN NaN + # 12 696 13 291642745 c5aaab62e0c78c34d93ee57795f06953 2165343 NaN NaN + # 13 696 14 5718664845 c7f1f4aa88887071a0d28154fc84604b 2007965 NaN NaN + # 14 696 
15 291642692 0ef007a79b391e8ba98daf4985f26f9b 2160569 NaN NaN + # 15 696 16 5718664843 2ce63288e77747abc3a4124f0e28efcf 2047955 NaN NaN + # 16 696 17 3485537279 ec0c8eb524f41072a9fd87ecfd45e15f 2169094 NaN NaN + # 17 696 18 5718664419 57ca23828db4adea39355a92fb0fc3ff 2082102 NaN NaN + # 18 696 19 5718664417 4aba41268ada1058ee58e99a84e28d37 2019974 NaN NaN + # 19 696 20 65545418 d4f815a2f6da6c95d2f032a3cd61020c 2025374 NaN NaN # trip_node_df.dtypes: + # shape_id object + # shape_pt_sequence int64 + # shape_osm_node_id object + # shape_shst_node_id object + # shape_model_node_id object + # shape_pt_lat object + # shape_pt_lon object + + # we only need: shape_id, shape_pt_sequence, shape_model_node_id + trip_node_df = trip_node_df[['shape_id','shape_pt_sequence','shape_model_node_id']] + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + # stop_name object + # stop_lat float64 + # stop_lon float64 + # zone_id object + # agency_raw_name object + # stop_code object + # location_type float64 + # parent_station object + # stop_desc object + # stop_url object + # stop_timezone object + # wheelchair_boarding float64 + # platform_code object + # position object + # direction object + # * used by routes object + # osm_node_id object + # shst_node_id object + # model_node_id object + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # this is the same as shape_gtfs_to_cube but we'll build up a list of dicts with shape/stop information + shape_stop_dict_list = [] + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id' ] = trip_id + node_dict['is_stop' ] = True + node_dict['access' ] = access_v + node_dict['stop_sequence'] = stop_seq + shape_stop_dict_list.append(node_dict) + + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id'] = trip_id + node_dict['is_stop'] = False + shape_stop_dict_list.append(node_dict) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + # print("node_list_str: {}".format(node_list_str)) + return shape_stop_dict_list
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str
+ + +
[docs] def cube_format(self, row): + """ + Creates a string representing the route in cube line file notation. + #MC + Args: + row: row of a DataFrame representing a cube-formatted trip, with the attributes + trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR + + Returns: + string representation of route in cube line file notation + """ + + s = '\nLINE NAME="{}",'.format(row.NAME) + s += '\n LONGNAME="{}",'.format(row.LONGNAME) + s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY) + s += "\n MODE={},".format(row.MODE) + s += "\n ONEWAY={},".format(row.ONEWAY) + s += "\n OPERATOR={},".format(row.OPERATOR) + s += "\n NODES={}".format(self.shape_gtfs_to_cube(row)) + + return s
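For orientation only, roughly what one formatted trip looks like (all values illustrative, not taken from a real feed):

    LINE NAME="0_21_21_AM1",
     LONGNAME="Route 21",
     HEADWAY[2]=15,
     MODE=5,
     ONEWAY=T,
     OPERATOR=3,
     NODES=
      12345,
      -12346,
      12347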
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + # moved this here from top since this StandardTransit shouldn't depend on mtc... + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = 
base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
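A minimal parsing sketch tying the grammar and transformer together (this is not the exact configuration used inside CubeTransit.add_cube, which may pass other options to Lark, and the sample line text is illustrative):

    from lark import Lark

    lin_text = (
        ';;<<PT>><<LINE>>;;\n'
        'LINE NAME="0_452_pk1", HEADWAY[1]=10, MODE=5, ONEWAY=T, '
        'N=39249, -39240, 54648'
    )
    parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, start="start")
    parse_tree = parser.parse(lin_text)
    transformed_tree_data = CubeTransformer().transform(parse_tree)
    # transformed_tree_data["lines"] maps each line name to its
    # "line_properties" dict and "line_shape" DataFrame of nodes.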
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_modules/lasso/util/index.html b/branch/seperate_maz_and_taz/_modules/lasso/util/index.html new file mode 100644 index 0000000..4b81954 --- /dev/null +++ b/branch/seperate_maz_and_taz/_modules/lasso/util/index.html @@ -0,0 +1,256 @@ + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None): + """ + Calculated per: + https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565 + Expected in/out + -93.0965985, 44.952112199999995 osm_node_id = 954734870 + 69f13f881649cb21ee3b359730790bb9 + + """ + import hashlib + + message = "Intersection {0:.5f} {0:.5f}".format(long, lat) + if osm_node_id: + message += " {}".format(osm_node_id) + unhashed = message.encode("utf-8") + hash = hashlib.md5(unhashed).hexdigest() + return hash
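The call below simply mirrors the expected in/out documented in the docstring above, following the argument order shown there:

    get_shared_streets_intersection_hash(-93.0965985, 44.952112199999995, 954734870)
    # -> '69f13f881649cb21ee3b359730790bb9'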
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
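Quick examples of the two time helpers above:

    from lasso.util import hhmmss_to_datetime, secs_to_datetime

    hhmmss_to_datetime("06:30:00")        # -> datetime.time(6, 30)
    secs_to_datetime(6 * 3600 + 30 * 60)  # -> datetime.time(6, 30)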
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters): + """ + creates circular buffer polygon for node + + Args: + lat: node lat + lon: node lon + meters: buffer distance, radius of circle + Returns: + Polygon + """ + proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84') + # Azimuthal equidistant projection + aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0' + project = partial( + pyproj.transform, + pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)), + proj_wgs84) + buf = Point(0, 0).buffer(meters) # distance in meters + return Polygon(transform(project, buf).exterior.coords[:])
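A short usage sketch (the coordinates are illustrative):

    from lasso.util import geodesic_point_buffer

    # ~100 m circular buffer, returned as a shapely Polygon in WGS84 lon/lat
    buf = geodesic_point_buffer(lat=44.95, lon=-93.09, meters=100)
    buf.is_valid   # True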
+ +
[docs]def create_locationreference(node, link): + node['X'] = node['geometry'].apply(lambda p: p.x) + node['Y'] = node['geometry'].apply(lambda p: p.y) + node['point'] = [list(xy) for xy in zip(node.X, node.Y)] + node_dict = dict(zip(node.model_node_id, node.point)) + + link['A_point'] = link['A'].map(node_dict) + link['B_point'] = link['B'].map(node_dict) + link['locationReferences'] = link.apply(lambda x: [{'sequence':1, + 'point': x['A_point'], + 'distanceToNextRef':x['length'], + 'bearing' : 0, + 'intersectionId':x['fromIntersectionId']}, + {'sequence':2, + 'point': x['B_point'], + 'intersectionId':x['toIntersectionId']}], + axis = 1)
+ +
[docs]def column_name_to_parts(c, parameters=None): + + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + + return base_name, time_period, category, managed
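A usage sketch assuming a Parameters configuration in which "lanes" appears in `properties_to_split` and "AM" is a time-period key; the split depends entirely on that configuration, so check your Parameters instance:

    from lasso.util import column_name_to_parts

    base_name, time_period, category, managed = column_name_to_parts("lanes_AM")
    # -> ("lanes", "AM", None, False) under those assumptions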
+ +
[docs]def shorten_name(name): + if type(name) == str: + name_list = name.split(',') + else: + name_list = name + name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list] + + name_list = list(set(name_list)) + #name_list.remove('') + + name_new = ' '.join(name_list).strip(' ') + + # convert non english character to english + name_new = unidecode(name_new) + + return name_new
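A quick example of the name clean-up: duplicates and 'nan' placeholders are dropped before the pieces are rejoined (because a set is used, the order of multiple distinct names in the output is not guaranteed):

    from lasso.util import shorten_name

    shorten_name("Main St,Main St,nan")   # -> 'Main St'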
+
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_modules/shapely/geometry/point/index.html b/branch/seperate_maz_and_taz/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..63b367a --- /dev/null +++ b/branch/seperate_maz_and_taz/_modules/shapely/geometry/point/index.html @@ -0,0 +1,252 @@ + + + + + + shapely.geometry.point — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+import numpy as np
+
+import shapely
+from shapely.errors import DimensionError
+from shapely.geometry.base import BaseGeometry
+
+__all__ = ["Point"]
+
+
+
[docs]class Point(BaseGeometry): + """ + A geometry type that represents a single coordinate with + x,y and possibly z values. + + A point is a zero-dimensional feature and has zero length and zero area. + + Parameters + ---------- + args : float, or sequence of floats + The coordinates can either be passed as a single parameter, or as + individual float values using multiple parameters: + + 1) 1 parameter: a sequence or array-like of with 2 or 3 values. + 2) 2 or 3 parameters (float): x, y, and possibly z. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Examples + -------- + Constructing the Point using separate parameters for x and y: + + >>> p = Point(1.0, -1.0) + + Constructing the Point using a list of x, y coordinates: + + >>> p = Point([1.0, -1.0]) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + __slots__ = [] + + def __new__(self, *args): + if len(args) == 0: + # empty geometry + # TODO better constructor + return shapely.from_wkt("POINT EMPTY") + elif len(args) > 3: + raise TypeError(f"Point() takes at most 3 arguments ({len(args)} given)") + elif len(args) == 1: + coords = args[0] + if isinstance(coords, Point): + return coords + + # Accept either (x, y) or [(x, y)] + if not hasattr(coords, "__getitem__"): # generators + coords = list(coords) + coords = np.asarray(coords).squeeze() + else: + # 2 or 3 args + coords = np.array(args).squeeze() + + if coords.ndim > 1: + raise ValueError( + f"Point() takes only scalar or 1-size vector arguments, got {args}" + ) + if not np.issubdtype(coords.dtype, np.number): + coords = [float(c) for c in coords] + geom = shapely.points(coords) + if not isinstance(geom, Point): + raise ValueError("Invalid values passed to Point constructor") + return geom + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return shapely.get_x(self) + + @property + def y(self): + """Return y coordinate.""" + return shapely.get_y(self) + + @property + def z(self): + """Return z coordinate.""" + if not shapely.has_z(self): + raise DimensionError("This point has no z coordinate.") + # return shapely.get_z(self) -> get_z only supported for GEOS 3.7+ + return self.coords[0][2] + + @property + def __geo_interface__(self): + return {"type": "Point", "coordinates": self.coords[0]} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3.0 * scale_factor, 1.0 * scale_factor, fill_color, opacity)
+ + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +shapely.lib.registry[0] = Point +
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_modules/shapely/geometry/polygon/index.html b/branch/seperate_maz_and_taz/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..3f094ba --- /dev/null +++ b/branch/seperate_maz_and_taz/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,462 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import numpy as np
+
+import shapely
+from shapely.algorithms.cga import is_ccw_impl, signed_area
+from shapely.errors import TopologicalError
+from shapely.geometry.base import BaseGeometry
+from shapely.geometry.linestring import LineString
+from shapely.geometry.point import Point
+
+__all__ = ["orient", "Polygon", "LinearRing"]
+
+
+def _unpickle_linearring(wkb):
+    linestring = shapely.from_wkb(wkb)
+    srid = shapely.get_srid(linestring)
+    linearring = shapely.linearrings(shapely.get_coordinates(linestring))
+    if srid:
+        linearring = shapely.set_srid(linearring, srid)
+    return linearring
+
+
+class LinearRing(LineString):
+    """
+    A geometry type composed of one or more line segments
+    that forms a closed loop.
+
+    A LinearRing is a closed, one-dimensional feature.
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+
+    Parameters
+    ----------
+    coordinates : sequence
+        A sequence of (x, y [,z]) numeric coordinate pairs or triples, or
+        an array-like with shape (N, 2) or (N, 3).
+        Also can be a sequence of Point objects.
+
+    Notes
+    -----
+    Rings are automatically closed. There is no need to specify a final
+    coordinate pair identical to the first.
+
+    Examples
+    --------
+    Construct a square ring.
+
+    >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+    >>> ring.is_closed
+    True
+    >>> list(ring.coords)
+    [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
+    >>> ring.length
+    4.0
+
+    """
+
+    __slots__ = []
+
+    def __new__(self, coordinates=None):
+        if coordinates is None:
+            # empty geometry
+            # TODO better way?
+            return shapely.from_wkt("LINEARRING EMPTY")
+        elif isinstance(coordinates, LineString):
+            if type(coordinates) == LinearRing:
+                # return original objects since geometries are immutable
+                return coordinates
+            elif not coordinates.is_valid:
+                raise TopologicalError("An input LineString must be valid.")
+            else:
+                # LineString
+                # TODO convert LineString to LinearRing more directly?
+                coordinates = coordinates.coords
+
+        else:
+            if hasattr(coordinates, "__array__"):
+                coordinates = np.asarray(coordinates)
+            if isinstance(coordinates, np.ndarray) and np.issubdtype(
+                coordinates.dtype, np.number
+            ):
+                pass
+            else:
+                # check coordinates on points
+                def _coords(o):
+                    if isinstance(o, Point):
+                        return o.coords[0]
+                    else:
+                        return [float(c) for c in o]
+
+                coordinates = np.array([_coords(o) for o in coordinates])
+                if not np.issubdtype(coordinates.dtype, np.number):
+                    # conversion of coords to 2D array failed, this might be due
+                    # to inconsistent coordinate dimensionality
+                    raise ValueError("Inconsistent coordinate dimensionality")
+
+        if len(coordinates) == 0:
+            # empty geometry
+            # TODO better constructor + should shapely.linearrings handle this?
+            return shapely.from_wkt("LINEARRING EMPTY")
+
+        geom = shapely.linearrings(coordinates)
+        if not isinstance(geom, LinearRing):
+            raise ValueError("Invalid values passed to LinearRing constructor")
+        return geom
+
+    @property
+    def __geo_interface__(self):
+        return {"type": "LinearRing", "coordinates": tuple(self.coords)}
+
+    def __reduce__(self):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        return (_unpickle_linearring, (shapely.to_wkb(self, include_srid=True),))
+
+    @property
+    def is_ccw(self):
+        """True if the ring is oriented counter clock-wise"""
+        return bool(is_ccw_impl()(self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return bool(shapely.is_simple(self))
+
+
+shapely.lib.registry[2] = LinearRing
+
+
+class InteriorRingSequence:
+
+    _parent = None
+    _ndim = None
+    _index = 0
+    _length = 0
+
+    def __init__(self, parent):
+        self._parent = parent
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return shapely.get_num_interior_rings(self._parent)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    def _get_ring(self, i):
+        return shapely.get_interior_ring(self._parent, i)
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A geometry type representing an area that is enclosed by a linear ring. + + A polygon is a two-dimensional feature and has a non-zero area. It may + have one or more negative-space "holes" which are also bounded by linear + rings. If any rings cross each other, the feature is invalid and + operations on it may fail. + + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples, or + an array-like with shape (N, 2) or (N, 3). + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + + Examples + -------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + + __slots__ = [] + + def __new__(self, shell=None, holes=None): + if shell is None: + # empty geometry + # TODO better way? + return shapely.from_wkt("POLYGON EMPTY") + elif isinstance(shell, Polygon): + # return original objects since geometries are immutable + return shell + else: + shell = LinearRing(shell) + + if holes is not None: + if len(holes) == 0: + # shapely constructor cannot handle holes=[] + holes = None + else: + holes = [LinearRing(ring) for ring in holes] + + geom = shapely.polygons(shell, holes=holes) + if not isinstance(geom, Polygon): + raise ValueError("Invalid values passed to Polygon constructor") + return geom + + @property + def exterior(self): + return shapely.get_exterior_ring(self) + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not" + ) + + def __eq__(self, other): + if not isinstance(other, BaseGeometry): + return NotImplemented + if not isinstance(other, Polygon): + return False + check_empty = (self.is_empty, other.is_empty) + if all(check_empty): + return True + elif any(check_empty): + return False + my_coords = [self.exterior.coords] + [ + interior.coords for interior in self.interiors + ] + other_coords = [other.exterior.coords] + [ + interior.coords for interior in other.interiors + ] + if not len(my_coords) == len(other_coords): + return False + # equal_nan=False is the default, but not yet available for older numpy + return np.all( + [ + np.array_equal(left, right) # , equal_nan=False) + for left, right in zip(my_coords, other_coords) + ] + ) + + def __hash__(self): + return super().__hash__() + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return {"type": "Polygon", "coordinates": tuple(coords)} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] for interior in self.interiors + ] + path = " ".join( + [ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords + ] + ) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2.0 * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin)])
+ + +shapely.lib.registry[3] = Polygon + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring) / s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring) / s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) +
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_modules/shapely/ops/index.html b/branch/seperate_maz_and_taz/_modules/shapely/ops/index.html new file mode 100644 index 0000000..48045cb --- /dev/null +++ b/branch/seperate_maz_and_taz/_modules/shapely/ops/index.html @@ -0,0 +1,845 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from warnings import warn
+
+import shapely
+from shapely.algorithms.polylabel import polylabel  # noqa
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.geometry import (
+    GeometryCollection,
+    LineString,
+    MultiLineString,
+    MultiPoint,
+    Point,
+    Polygon,
+    shape,
+)
+from shapely.geometry.base import BaseGeometry, BaseMultipartGeometry
+from shapely.geometry.polygon import orient as orient_
+from shapely.prepared import prep
+
+__all__ = [
+    "cascaded_union",
+    "linemerge",
+    "operator",
+    "polygonize",
+    "polygonize_full",
+    "transform",
+    "unary_union",
+    "triangulate",
+    "voronoi_diagram",
+    "split",
+    "nearest_points",
+    "validate",
+    "snap",
+    "shared_paths",
+    "clip_by_rect",
+    "orient",
+    "substring",
+]
+
+
+class CollectionOperator:
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        collection = shapely.polygonize(obs)
+        return collection.geoms
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, cut edges, dangles, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        return shapely.polygonize_full(obs)
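+
+    # Illustrative usage sketch (assumes Shapely >= 2.0): the closed triangle
+    # becomes a polygon, while the free-hanging segment is reported as a dangle.
+    #
+    #   >>> from shapely.geometry import LineString
+    #   >>> from shapely.ops import polygonize_full
+    #   >>> lines = [
+    #   ...     LineString([(0, 0), (1, 0)]),
+    #   ...     LineString([(1, 0), (1, 1)]),
+    #   ...     LineString([(1, 1), (0, 0)]),
+    #   ...     LineString([(1, 1), (2, 2)]),
+    #   ... ]
+    #   >>> polygons, cuts, dangles, invalid = polygonize_full(lines)
+    #   >>> len(polygons.geoms), len(dangles.geoms)
+    #   (1, 1)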
+
+    def linemerge(self, lines, directed=False):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings. Returns a
+        single LineString when the lines are contiguous, or a MultiLineString
+        when they are not.
+        """
+        source = None
+        if getattr(lines, "geom_type", None) == "MultiLineString":
+            source = lines
+        elif hasattr(lines, "geoms"):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, "__iter__"):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError(f"Cannot linemerge {lines}")
+        return shapely.line_merge(source, directed=directed)
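+
+    # Illustrative usage sketch: two touching segments merge into one line.
+    #
+    #   >>> from shapely.geometry import MultiLineString
+    #   >>> from shapely.ops import linemerge
+    #   >>> mls = MultiLineString([[(0, 0), (1, 1)], [(1, 1), (2, 2)]])
+    #   >>> linemerge(mls).wkt
+    #   'LINESTRING (0 0, 1 1, 2 2)'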
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        .. deprecated:: 1.8
+            This function was superseded by :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning,
+            stacklevel=2,
+        )
+        return shapely.union_all(geoms, axis=None)
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        Usually used to convert a collection into the smallest set of polygons
+        that cover the same area.
+        """
+        return shapely.union_all(geoms, axis=None)
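+
+    # Illustrative usage sketch: two overlapping buffers dissolve into a
+    # single Polygon.
+    #
+    #   >>> from shapely.geometry import Point
+    #   >>> from shapely.ops import unary_union
+    #   >>> merged = unary_union([Point(0, 0).buffer(1.0), Point(1, 0).buffer(1.0)])
+    #   >>> merged.geom_type
+    #   'Polygon'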
+
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise, a list of LineString edges is returned.
+
+    """
+    collection = shapely.delaunay_triangles(geom, tolerance=tolerance, only_edges=edges)
+    return [g for g in collection.geoms]
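+
+# Illustrative usage sketch: the four corners of a unit square triangulate
+# into two triangles.
+#
+#   >>> from shapely.geometry import MultiPoint
+#   >>> from shapely.ops import triangulate
+#   >>> triangles = triangulate(MultiPoint([(0, 0), (1, 0), (0, 1), (1, 1)]))
+#   >>> len(triangles), {t.geom_type for t in triangles}
+#   (2, {'Polygon'})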
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a GeometryCollection of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges, e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The `tolerance` argument can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    try:
+        result = shapely.voronoi_polygons(
+            geom, tolerance=tolerance, extend_to=envelope, only_edges=edges
+        )
+    except shapely.GEOSException as err:
+        errstr = "Could not create Voronoi Diagram with the specified inputs "
+        errstr += f"({err!s})."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr) from err
+
+    if result.geom_type != "GeometryCollection":
+        return GeometryCollection([result])
+    return result
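+
+# Illustrative usage sketch: three sites yield one Voronoi region per site,
+# wrapped in a GeometryCollection.
+#
+#   >>> from shapely.geometry import MultiPoint
+#   >>> from shapely.ops import voronoi_diagram
+#   >>> regions = voronoi_diagram(MultiPoint([(0, 0), (1, 1), (2, 0)]))
+#   >>> regions.geom_type, len(regions.geoms)
+#   ('GeometryCollection', 3)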
+
+
+def validate(geom):
+    return shapely.is_valid_reason(geom)
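+
+# `validate` returns the GEOS validity explanation for a geometry: the string
+# "Valid Geometry" for a valid input, or a short description of the problem
+# (the exact wording depends on the GEOS version). Illustrative sketch:
+#
+#   >>> from shapely.geometry import Polygon
+#   >>> from shapely.ops import validate
+#   >>> bowtie = Polygon([(0, 0), (2, 2), (2, 0), (0, 2), (0, 0)])
+#   >>> validate(bowtie)
+#   'Self-intersection[1 1]'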
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.geom_type in ("Point", "LineString", "LinearRing", "Polygon"): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)(zip(*func(*zip(*geom.exterior.coords)))) + holes = list( + type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)([func(*c) for c in geom.exterior.coords]) + holes = list( + type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + elif geom.geom_type.startswith("Multi") or geom.geom_type == "GeometryCollection": + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError(f"Type {geom.geom_type!r} not recognized")
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = shapely.shortest_line(g1, g2) + if seq is None: + if g1.is_empty: + raise ValueError("The first input geometry is empty") + else: + raise ValueError("The second input geometry is empty") + + p1 = shapely.get_point(seq, 0) + p2 = shapely.get_point(seq, 1) + return (p1, p2) + + +def snap(g1, g2, tolerance): + """ + Snaps an input geometry (g1) to reference (g2) geometry's vertices. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Refer to :func:`shapely.snap` for full documentation. + """ + + return shapely.snap(g1, g2, tolerance) + + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return shapely.shared_paths(g1, g2) + + +class SplitOp: + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [ + pg for pg in polygonize(union) if poly.contains(pg.representative_point()) + ] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.geom_type in ("Polygon", "MultiPolygon"): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance( + splitter, MultiLineString + ): + raise GeometryTypeError( + "Second argument must be either a LineString or a MultiLineString" + ) + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == "1": + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError("Input geometry segment overlaps with the splitter.") + elif relation[0] == "0" or relation[3] == "0": + # The splitter crosses or touches the line's 
interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, "0********"): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords) - 1): + point1 = coords[i] + point2 = coords[i + 1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx**2 + dy**2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [LineString(coords[: i + 2]), LineString(coords[i + 1 :])] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[: i + 1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i + 1 :]), + ] + return [line] + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. + When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. 
+ + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.geom_type in ("MultiLineString", "MultiPolygon"): + return GeometryCollection( + [i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms] + ) + + elif geom.geom_type == "LineString": + if splitter.geom_type in ( + "LineString", + "MultiLineString", + "Polygon", + "MultiPolygon", + ): + split_func = SplitOp._split_line_with_line + elif splitter.geom_type == "Point": + split_func = SplitOp._split_line_with_point + elif splitter.geom_type == "MultiPoint": + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError( + f"Splitting a LineString with a {splitter.geom_type} is not supported" + ) + + elif geom.geom_type == "Polygon": + if splitter.geom_type == "LineString": + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError( + f"Splitting a Polygon with a {splitter.geom_type} is not supported" + ) + + else: + raise GeometryTypeError( + f"Splitting {geom.geom_type} geometry is not supported" + ) + + return GeometryCollection(split_func(geom, splitter)) + + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. + + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError( + "Can only calculate a substring of LineString geometries. " + f"A {geom.geom_type} was provided." 
+ ) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [tuple(*end_point.coords)] + else: + vertex_list = [tuple(*start_point.coords)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append(tuple(*start_point.coords)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append(tuple(*end_point.coords)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + return shapely.clip_by_rect(geom, xmin, ymin, xmax, ymax) + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. + + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
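+
+# Illustrative usage sketch for ``orient``: a polygon built from a clockwise
+# ring is returned with its exterior re-ordered counter-clockwise.
+#
+#   >>> from shapely.geometry import Polygon
+#   >>> from shapely.ops import orient
+#   >>> clockwise = Polygon([(0, 0), (0, 1), (1, 1), (1, 0)])
+#   >>> clockwise.exterior.is_ccw
+#   False
+#   >>> orient(clockwise).exterior.is_ccw
+#   True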
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_sources/_generated/lasso.CubeTransit.rst.txt b/branch/seperate_maz_and_taz/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/branch/seperate_maz_and_taz/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + 
~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + ~ModelRoadwayNetwork.path_search + ~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_sources/_generated/lasso.Parameters.rst.txt b/branch/seperate_maz_and_taz/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..4aeacfb --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.maz_shape_file + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_sources/_generated/lasso.Project.rst.txt b/branch/seperate_maz_and_taz/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..e6e6bcc --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatability + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_sources/_generated/lasso.StandardTransit.rst.txt b/branch/seperate_maz_and_taz/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..4fae048 --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,33 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_dict_list + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_sources/_generated/lasso.logger.rst.txt b/branch/seperate_maz_and_taz/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/branch/seperate_maz_and_taz/_sources/_generated/lasso.util.rst.txt b/branch/seperate_maz_and_taz/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/branch/seperate_maz_and_taz/_sources/autodoc.rst.txt b/branch/seperate_maz_and_taz/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/branch/seperate_maz_and_taz/_sources/index.rst.txt b/branch/seperate_maz_and_taz/_sources/index.rst.txt new file mode 100644 index 0000000..616853c --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/index.rst.txt @@ -0,0 +1,36 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +`network_wrangler `_ package +for MetCouncil and MTC. It aims to have the following functionality: + +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for the respective agency and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/branch/seperate_maz_and_taz/_sources/running.md.txt b/branch/seperate_maz_and_taz/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/branch/seperate_maz_and_taz/_sources/setup.md.txt b/branch/seperate_maz_and_taz/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/branch/seperate_maz_and_taz/_sources/starting.md.txt b/branch/seperate_maz_and_taz/_sources/starting.md.txt new file mode 100644 index 0000000..8886f95 --- /dev/null +++ b/branch/seperate_maz_and_taz/_sources/starting.md.txt @@ -0,0 +1,292 @@ +# Starting Out + +## Installation + +If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +Example using a conda environment (recommended) and using the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso , you can do so by installing it from the `develop` branch of + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n +conda activate +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The -e installs it in editable mode. +2. If you are not part of the project team and want to contribute code bxack to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. if you wanted to install from a specific tag/version number or branch, replace `@master` with `@` or `@tag` +4. 
If you want to make use of frequent developer updates for network wrangler as well, you can also install it from clone by copying the instructions for cloning and installing Lasso for Network Wrangler + +If you are going to be doing Lasso development, we also recommend: + - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc. + with Python syntax highlighting turned on. + - [GitHub Desktop](https://desktop.github.com/) to locally update your clones + +## Brief Intro + +Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility. + +Both Lasso and NetworkWrangler are built around the following data schemas: + - [`roadway network`], which is based on a mashup of Open Street Map and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three json files reprsenting: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. + - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and + - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml. + +In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler. + - [`Cube public transport line files`], which define a set of transit lines in the cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and defines the changes; or contains information about a new facility to be constructed or a new service to be run.; + - _Scenario_ object, which consist of at least a RoadwayNetwork, and +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, creates project cards from one of the following: a base and a build transit network in cube format, a base and build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object is a subclass of `RoadwayNetwork` and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files` + . 
Has the capability to parse cube line file properties and shapes into python dictionaries and compare line files and represent changes as Project Card dictionaries. + - _Parameters_, A class representing all the parameters defining the networks + including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries and and manipulates roadway network data, which +is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_filename=MY_LINK_FILE, + node_filename=MY_NODE_FILE, + shape_filename=MY_SHAPE_FILE, + shape_foreign_key ='shape_id', + + ) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +my_net.apply_roadway_feature_change( + my_net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive")) + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard + +#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network; + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +``` + +#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork +class which additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema. 
+ +```Python + +net = ModelRoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. + +```Python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary. + +```Python +# read parameters from a yaml configuration file +# could also provide as a key/value pair +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# network written with direction from the parameters given +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario + + + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate + jupyter notebook + ``` diff --git a/branch/seperate_maz_and_taz/_static/_sphinx_javascript_frameworks_compat.js b/branch/seperate_maz_and_taz/_static/_sphinx_javascript_frameworks_compat.js new file mode 100644 index 0000000..8141580 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/_sphinx_javascript_frameworks_compat.js @@ -0,0 +1,123 @@ +/* Compatability shim for jQuery and underscores.js. + * + * Copyright Sphinx contributors + * Released under the two clause BSD licence + */ + +/** + * small helper function to urldecode strings + * + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL + */ +jQuery.urldecode = function(x) { + if (!x) { + return x + } + return decodeURIComponent(x.replace(/\+/g, ' ')); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. Multiple values per key are supported, + * it will always return arrays of strings for the value parts. 
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/branch/seperate_maz_and_taz/_static/basic.css b/branch/seperate_maz_and_taz/_static/basic.css new file mode 100644 index 0000000..cfc60b8 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/basic.css @@ -0,0 +1,921 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +nav.contents, +aside.topic, +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + 
border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; 
+ word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +.sig dd { + margin-top: 0px; + margin-bottom: 0px; +} + +.sig dl { + margin-top: 0px; + margin-bottom: 0px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +.translated { + background-color: rgba(207, 255, 207, 0.2) +} + +.untranslated { + background-color: rgba(255, 207, 207, 0.2) +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: 
-1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_static/css/badge_only.css b/branch/seperate_maz_and_taz/_static/css/badge_only.css new file mode 100644 index 0000000..c718cee --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd 
a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Bold.woff b/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Bold.woff2 b/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Regular.woff b/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Regular.woff2 b/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.eot b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.svg b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ [fontawesome-webfont.svg: SVG glyph definitions omitted — the added <glyph> markup did not survive extraction; only the leading "+" diff markers remained] + + + diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.ttf b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.woff b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.woff2 b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold-italic.woff b/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold-italic.woff differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold-italic.woff2 b/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold.woff b/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold.woff differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold.woff2 b/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/lato-bold.woff2 differ diff --git
a/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal-italic.woff b/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal-italic.woff differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal-italic.woff2 b/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal.woff b/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal.woff differ diff --git a/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal.woff2 b/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/css/fonts/lato-normal.woff2 differ diff --git a/branch/seperate_maz_and_taz/_static/css/theme.css b/branch/seperate_maz_and_taz/_static/css/theme.css new file mode 100644 index 0000000..19a446a --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 
0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 
.fa-pull-right.headerlink,.rst-content p .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download span.fa-pull-right:first-child,.wy-menu-vertical li.current>a button.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-right.toctree-expand,.wy-menu-vertical li button.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .eqno .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a button.pull-left.toctree-expand,.wy-menu-vertical li.on a button.pull-left.toctree-expand,.wy-menu-vertical li button.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .eqno .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt .pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a button.pull-right.toctree-expand,.wy-menu-vertical li.on a button.pull-right.toctree-expand,.wy-menu-vertical li button.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root 
.fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success 
.wy-input-context:before{content:""}.fa-question-circle:before{content:""}.fa-info-circle:before{content:""}.fa-crosshairs:before{content:""}.fa-times-circle-o:before{content:""}.fa-check-circle-o:before{content:""}.fa-ban:before{content:""}.fa-arrow-left:before{content:""}.fa-arrow-right:before{content:""}.fa-arrow-up:before{content:""}.fa-arrow-down:before{content:""}.fa-mail-forward:before,.fa-share:before{content:""}.fa-expand:before{content:""}.fa-compress:before{content:""}.fa-plus:before{content:""}.fa-minus:before{content:""}.fa-asterisk:before{content:""}.fa-exclamation-circle:before,.rst-content .admonition-title:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before{content:""}.fa-gift:before{content:""}.fa-leaf:before{content:""}.fa-fire:before,.icon-fire:before{content:""}.fa-eye:before{content:""}.fa-eye-slash:before{content:""}.fa-exclamation-triangle:before,.fa-warning:before{content:""}.fa-plane:before{content:""}.fa-calendar:before{content:""}.fa-random:before{content:""}.fa-comment:before{content:""}.fa-magnet:before{content:""}.fa-chevron-up:before{content:""}.fa-chevron-down:before{content:""}.fa-retweet:before{content:""}.fa-shopping-cart:before{content:""}.fa-folder:before{content:""}.fa-folder-open:before{content:""}.fa-arrows-v:before{content:""}.fa-arrows-h:before{content:""}.fa-bar-chart-o:before,.fa-bar-chart:before{content:""}.fa-twitter-square:before{content:""}.fa-facebook-square:before{content:""}.fa-camera-retro:before{content:""}.fa-key:before{content:""}.fa-cogs:before,.fa-gears:before{content:""}.fa-comments:before{content:""}.fa-thumbs-o-up:before{content:""}.fa-thumbs-o-down:before{content:""}.fa-star-half:before{content:""}.fa-heart-o:before{content:""}.fa-sign-out:before{content:""}.fa-linkedin-square:before{content:""}.fa-thumb-tack:before{content:""}.fa-external-link:before{content:""}.fa-sign-in:before{content:""}.fa-trophy:before{content:""}.fa-github-square:before{content:""}.fa-upload:before{content:""}.fa-lemon-o:before{content:""}.fa-phone:before{content:""}.fa-square-o:before{content:""}.fa-bookmark-o:before{content:""}.fa-phone-square:before{content:""}.fa-twitter:before{content:""}.fa-facebook-f:before,.fa-facebook:before{content:""}.fa-github:before,.icon-github:before{content:""}.fa-unlock:before{content:""}.fa-credit-card:before{content:""}.fa-feed:before,.fa-rss:before{content:""}.fa-hdd-o:before{content:""}.fa-bullhorn:before{content:""}.fa-bell:before{content:""}.fa-certificate:before{content:""}.fa-hand-o-right:before{content:""}.fa-hand-o-left:before{content:""}.fa-hand-o-up:before{content:""}.fa-hand-o-down:before{content:""}.fa-arrow-circle-left:before,.icon-circle-arrow-left:before{content:""}.fa-arrow-circle-right:before,.icon-circle-arrow-right:before{content:""}.fa-arrow-circle-up:before{content:""}.fa-arrow-circle-down:before{content:""}.fa-globe:before{content:""}.fa-wrench:before{content:""}.fa-tasks:before{content:""}.fa-filter:before{content:""}.fa-briefcase:before{content:""}.fa-arrows-alt:before{content:""}.fa-group:before,.fa-users:before{content:""}.fa-chain:before,.fa-link:before,.icon-link:before{content:""}.fa-cloud:before{content:""}.fa-flask:before{content:""}.fa-cut:before,.fa-scissors:before{content:""}.fa-copy:before,.fa-files-o:before{content:""}.fa-paperclip:before{content:""}.fa-floppy-o:before,.fa-save:before{content:""}.fa
-square:before{content:""}.fa-bars:before,.fa-navicon:before,.fa-reorder:before{content:""}.fa-list-ul:before{content:""}.fa-list-ol:before{content:""}.fa-strikethrough:before{content:""}.fa-underline:before{content:""}.fa-table:before{content:""}.fa-magic:before{content:""}.fa-truck:before{content:""}.fa-pinterest:before{content:""}.fa-pinterest-square:before{content:""}.fa-google-plus-square:before{content:""}.fa-google-plus:before{content:""}.fa-money:before{content:""}.fa-caret-down:before,.icon-caret-down:before,.wy-dropdown .caret:before{content:""}.fa-caret-up:before{content:""}.fa-caret-left:before{content:""}.fa-caret-right:before{content:""}.fa-columns:before{content:""}.fa-sort:before,.fa-unsorted:before{content:""}.fa-sort-desc:before,.fa-sort-down:before{content:""}.fa-sort-asc:before,.fa-sort-up:before{content:""}.fa-envelope:before{content:""}.fa-linkedin:before{content:""}.fa-rotate-left:before,.fa-undo:before{content:""}.fa-gavel:before,.fa-legal:before{content:""}.fa-dashboard:before,.fa-tachometer:before{content:""}.fa-comment-o:before{content:""}.fa-comments-o:before{content:""}.fa-bolt:before,.fa-flash:before{content:""}.fa-sitemap:before{content:""}.fa-umbrella:before{content:""}.fa-clipboard:before,.fa-paste:before{content:""}.fa-lightbulb-o:before{content:""}.fa-exchange:before{content:""}.fa-cloud-download:before{content:""}.fa-cloud-upload:before{content:""}.fa-user-md:before{content:""}.fa-stethoscope:before{content:""}.fa-suitcase:before{content:""}.fa-bell-o:before{content:""}.fa-coffee:before{content:""}.fa-cutlery:before{content:""}.fa-file-text-o:before{content:""}.fa-building-o:before{content:""}.fa-hospital-o:before{content:""}.fa-ambulance:before{content:""}.fa-medkit:before{content:""}.fa-fighter-jet:before{content:""}.fa-beer:before{content:""}.fa-h-square:before{content:""}.fa-plus-square:before{content:""}.fa-angle-double-left:before{content:""}.fa-angle-double-right:before{content:""}.fa-angle-double-up:before{content:""}.fa-angle-double-down:before{content:""}.fa-angle-left:before{content:""}.fa-angle-right:before{content:""}.fa-angle-up:before{content:""}.fa-angle-down:before{content:""}.fa-desktop:before{content:""}.fa-laptop:before{content:""}.fa-tablet:before{content:""}.fa-mobile-phone:before,.fa-mobile:before{content:""}.fa-circle-o:before{content:""}.fa-quote-left:before{content:""}.fa-quote-right:before{content:""}.fa-spinner:before{content:""}.fa-circle:before{content:""}.fa-mail-reply:before,.fa-reply:before{content:""}.fa-github-alt:before{content:""}.fa-folder-o:before{content:""}.fa-folder-open-o:before{content:""}.fa-smile-o:before{content:""}.fa-frown-o:before{content:""}.fa-meh-o:before{content:""}.fa-gamepad:before{content:""}.fa-keyboard-o:before{content:""}.fa-flag-o:before{content:""}.fa-flag-checkered:before{content:""}.fa-terminal:before{content:""}.fa-code:before{content:""}.fa-mail-reply-all:before,.fa-reply-all:before{content:""}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:""}.fa-location-arrow:before{content:""}.fa-crop:before{content:""}.fa-code-fork:before{content:""}.fa-chain-broken:before,.fa-unlink:before{content:""}.fa-question:before{content:""}.fa-info:before{content:""}.fa-exclamation:before{content:""}.fa-superscript:before{content:""}.fa-subscript:before{content:""}.fa-eraser:before{content:""}.fa-puzzle-piece:before{content:""}.fa-microphone:before{content:""}.fa-microphone-slash:
before{content:""}.fa-shield:before{content:""}.fa-calendar-o:before{content:""}.fa-fire-extinguisher:before{content:""}.fa-rocket:before{content:""}.fa-maxcdn:before{content:""}.fa-chevron-circle-left:before{content:""}.fa-chevron-circle-right:before{content:""}.fa-chevron-circle-up:before{content:""}.fa-chevron-circle-down:before{content:""}.fa-html5:before{content:""}.fa-css3:before{content:""}.fa-anchor:before{content:""}.fa-unlock-alt:before{content:""}.fa-bullseye:before{content:""}.fa-ellipsis-h:before{content:""}.fa-ellipsis-v:before{content:""}.fa-rss-square:before{content:""}.fa-play-circle:before{content:""}.fa-ticket:before{content:""}.fa-minus-square:before{content:""}.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before{content:""}.fa-level-up:before{content:""}.fa-level-down:before{content:""}.fa-check-square:before{content:""}.fa-pencil-square:before{content:""}.fa-external-link-square:before{content:""}.fa-share-square:before{content:""}.fa-compass:before{content:""}.fa-caret-square-o-down:before,.fa-toggle-down:before{content:""}.fa-caret-square-o-up:before,.fa-toggle-up:before{content:""}.fa-caret-square-o-right:before,.fa-toggle-right:before{content:""}.fa-eur:before,.fa-euro:before{content:""}.fa-gbp:before{content:""}.fa-dollar:before,.fa-usd:before{content:""}.fa-inr:before,.fa-rupee:before{content:""}.fa-cny:before,.fa-jpy:before,.fa-rmb:before,.fa-yen:before{content:""}.fa-rouble:before,.fa-rub:before,.fa-ruble:before{content:""}.fa-krw:before,.fa-won:before{content:""}.fa-bitcoin:before,.fa-btc:before{content:""}.fa-file:before{content:""}.fa-file-text:before{content:""}.fa-sort-alpha-asc:before{content:""}.fa-sort-alpha-desc:before{content:""}.fa-sort-amount-asc:before{content:""}.fa-sort-amount-desc:before{content:""}.fa-sort-numeric-asc:before{content:""}.fa-sort-numeric-desc:before{content:""}.fa-thumbs-up:before{content:""}.fa-thumbs-down:before{content:""}.fa-youtube-square:before{content:""}.fa-youtube:before{content:""}.fa-xing:before{content:""}.fa-xing-square:before{content:""}.fa-youtube-play:before{content:""}.fa-dropbox:before{content:""}.fa-stack-overflow:before{content:""}.fa-instagram:before{content:""}.fa-flickr:before{content:""}.fa-adn:before{content:""}.fa-bitbucket:before,.icon-bitbucket:before{content:""}.fa-bitbucket-square:before{content:""}.fa-tumblr:before{content:""}.fa-tumblr-square:before{content:""}.fa-long-arrow-down:before{content:""}.fa-long-arrow-up:before{content:""}.fa-long-arrow-left:before{content:""}.fa-long-arrow-right:before{content:""}.fa-apple:before{content:""}.fa-windows:before{content:""}.fa-android:before{content:""}.fa-linux:before{content:""}.fa-dribbble:before{content:""}.fa-skype:before{content:""}.fa-foursquare:before{content:""}.fa-trello:before{content:""}.fa-female:before{content:""}.fa-male:before{content:""}.fa-gittip:before,.fa-gratipay:before{content:""}.fa-sun-o:before{content:""}.fa-moon-o:before{content:""}.fa-archive:before{content:""}.fa-bug:before{content:""}.fa-vk:before{content:""}.fa-weibo:before{content:""}.fa-renren:before{content:""}.fa-pagelines:before{content:""}.fa-stack-exchange:before{content:""}.fa-arrow-circle-o-right:before{content:""}.fa-arrow-circle-o-left:before{content:""}.fa-caret-square-o-left:before,.fa-toggle-left:before{content:""}.fa-dot-circle-o:before{content:""}.fa-wheelchair:before{content:""}.fa-
vimeo-square:before{content:""}.fa-try:before,.fa-turkish-lira:before{content:""}.fa-plus-square-o:before,.wy-menu-vertical li button.toctree-expand:before{content:""}.fa-space-shuttle:before{content:""}.fa-slack:before{content:""}.fa-envelope-square:before{content:""}.fa-wordpress:before{content:""}.fa-openid:before{content:""}.fa-bank:before,.fa-institution:before,.fa-university:before{content:""}.fa-graduation-cap:before,.fa-mortar-board:before{content:""}.fa-yahoo:before{content:""}.fa-google:before{content:""}.fa-reddit:before{content:""}.fa-reddit-square:before{content:""}.fa-stumbleupon-circle:before{content:""}.fa-stumbleupon:before{content:""}.fa-delicious:before{content:""}.fa-digg:before{content:""}.fa-pied-piper-pp:before{content:""}.fa-pied-piper-alt:before{content:""}.fa-drupal:before{content:""}.fa-joomla:before{content:""}.fa-language:before{content:""}.fa-fax:before{content:""}.fa-building:before{content:""}.fa-child:before{content:""}.fa-paw:before{content:""}.fa-spoon:before{content:""}.fa-cube:before{content:""}.fa-cubes:before{content:""}.fa-behance:before{content:""}.fa-behance-square:before{content:""}.fa-steam:before{content:""}.fa-steam-square:before{content:""}.fa-recycle:before{content:""}.fa-automobile:before,.fa-car:before{content:""}.fa-cab:before,.fa-taxi:before{content:""}.fa-tree:before{content:""}.fa-spotify:before{content:""}.fa-deviantart:before{content:""}.fa-soundcloud:before{content:""}.fa-database:before{content:""}.fa-file-pdf-o:before{content:""}.fa-file-word-o:before{content:""}.fa-file-excel-o:before{content:""}.fa-file-powerpoint-o:before{content:""}.fa-file-image-o:before,.fa-file-photo-o:before,.fa-file-picture-o:before{content:""}.fa-file-archive-o:before,.fa-file-zip-o:before{content:""}.fa-file-audio-o:before,.fa-file-sound-o:before{content:""}.fa-file-movie-o:before,.fa-file-video-o:before{content:""}.fa-file-code-o:before{content:""}.fa-vine:before{content:""}.fa-codepen:before{content:""}.fa-jsfiddle:before{content:""}.fa-life-bouy:before,.fa-life-buoy:before,.fa-life-ring:before,.fa-life-saver:before,.fa-support:before{content:""}.fa-circle-o-notch:before{content:""}.fa-ra:before,.fa-rebel:before,.fa-resistance:before{content:""}.fa-empire:before,.fa-ge:before{content:""}.fa-git-square:before{content:""}.fa-git:before{content:""}.fa-hacker-news:before,.fa-y-combinator-square:before,.fa-yc-square:before{content:""}.fa-tencent-weibo:before{content:""}.fa-qq:before{content:""}.fa-wechat:before,.fa-weixin:before{content:""}.fa-paper-plane:before,.fa-send:before{content:""}.fa-paper-plane-o:before,.fa-send-o:before{content:""}.fa-history:before{content:""}.fa-circle-thin:before{content:""}.fa-header:before{content:""}.fa-paragraph:before{content:""}.fa-sliders:before{content:""}.fa-share-alt:before{content:""}.fa-share-alt-square:before{content:""}.fa-bomb:before{content:""}.fa-futbol-o:before,.fa-soccer-ball-o:before{content:""}.fa-tty:before{content:""}.fa-binoculars:before{content:""}.fa-plug:before{content:""}.fa-slideshare:before{content:""}.fa-twitch:before{content:""}.fa-yelp:before{content:""}.fa-newspaper-o:before{content:""}.fa-wifi:before{content:""}.fa-calculator:before{content:""}.fa-paypal:before{content:""}.fa-google-wallet:before{content:""}.fa-cc-visa:before{content:""}.fa-cc-mastercard:before{content:""}.fa-cc-discover:before{content:""}.fa-cc-amex:before{content:""}.fa-cc-paypal:before{content:""}.fa-cc-stripe:before{content:""}.fa-b
ell-slash:before{content:""}.fa-bell-slash-o:before{content:""}.fa-trash:before{content:""}.fa-copyright:before{content:""}.fa-at:before{content:""}.fa-eyedropper:before{content:""}.fa-paint-brush:before{content:""}.fa-birthday-cake:before{content:""}.fa-area-chart:before{content:""}.fa-pie-chart:before{content:""}.fa-line-chart:before{content:""}.fa-lastfm:before{content:""}.fa-lastfm-square:before{content:""}.fa-toggle-off:before{content:""}.fa-toggle-on:before{content:""}.fa-bicycle:before{content:""}.fa-bus:before{content:""}.fa-ioxhost:before{content:""}.fa-angellist:before{content:""}.fa-cc:before{content:""}.fa-ils:before,.fa-shekel:before,.fa-sheqel:before{content:""}.fa-meanpath:before{content:""}.fa-buysellads:before{content:""}.fa-connectdevelop:before{content:""}.fa-dashcube:before{content:""}.fa-forumbee:before{content:""}.fa-leanpub:before{content:""}.fa-sellsy:before{content:""}.fa-shirtsinbulk:before{content:""}.fa-simplybuilt:before{content:""}.fa-skyatlas:before{content:""}.fa-cart-plus:before{content:""}.fa-cart-arrow-down:before{content:""}.fa-diamond:before{content:""}.fa-ship:before{content:""}.fa-user-secret:before{content:""}.fa-motorcycle:before{content:""}.fa-street-view:before{content:""}.fa-heartbeat:before{content:""}.fa-venus:before{content:""}.fa-mars:before{content:""}.fa-mercury:before{content:""}.fa-intersex:before,.fa-transgender:before{content:""}.fa-transgender-alt:before{content:""}.fa-venus-double:before{content:""}.fa-mars-double:before{content:""}.fa-venus-mars:before{content:""}.fa-mars-stroke:before{content:""}.fa-mars-stroke-v:before{content:""}.fa-mars-stroke-h:before{content:""}.fa-neuter:before{content:""}.fa-genderless:before{content:""}.fa-facebook-official:before{content:""}.fa-pinterest-p:before{content:""}.fa-whatsapp:before{content:""}.fa-server:before{content:""}.fa-user-plus:before{content:""}.fa-user-times:before{content:""}.fa-bed:before,.fa-hotel:before{content:""}.fa-viacoin:before{content:""}.fa-train:before{content:""}.fa-subway:before{content:""}.fa-medium:before{content:""}.fa-y-combinator:before,.fa-yc:before{content:""}.fa-optin-monster:before{content:""}.fa-opencart:before{content:""}.fa-expeditedssl:before{content:""}.fa-battery-4:before,.fa-battery-full:before,.fa-battery:before{content:""}.fa-battery-3:before,.fa-battery-three-quarters:before{content:""}.fa-battery-2:before,.fa-battery-half:before{content:""}.fa-battery-1:before,.fa-battery-quarter:before{content:""}.fa-battery-0:before,.fa-battery-empty:before{content:""}.fa-mouse-pointer:before{content:""}.fa-i-cursor:before{content:""}.fa-object-group:before{content:""}.fa-object-ungroup:before{content:""}.fa-sticky-note:before{content:""}.fa-sticky-note-o:before{content:""}.fa-cc-jcb:before{content:""}.fa-cc-diners-club:before{content:""}.fa-clone:before{content:""}.fa-balance-scale:before{content:""}.fa-hourglass-o:before{content:""}.fa-hourglass-1:before,.fa-hourglass-start:before{content:""}.fa-hourglass-2:before,.fa-hourglass-half:before{content:""}.fa-hourglass-3:before,.fa-hourglass-end:before{content:""}.fa-hourglass:before{content:""}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:""}.fa-hand-paper-o:before,.fa-hand-stop-o:before{content:""}.fa-hand-scissors-o:before{content:""}.fa-hand-lizard-o:before{content:""}.fa-hand-spock-o:before{content:""}.fa-hand-pointer-o:before{content:""}.fa-hand-peace-o:before{content:""}.fa-trademark:before{content:""}.fa-register
ed:before{content:""}.fa-creative-commons:before{content:""}.fa-gg:before{content:""}.fa-gg-circle:before{content:""}.fa-tripadvisor:before{content:""}.fa-odnoklassniki:before{content:""}.fa-odnoklassniki-square:before{content:""}.fa-get-pocket:before{content:""}.fa-wikipedia-w:before{content:""}.fa-safari:before{content:""}.fa-chrome:before{content:""}.fa-firefox:before{content:""}.fa-opera:before{content:""}.fa-internet-explorer:before{content:""}.fa-television:before,.fa-tv:before{content:""}.fa-contao:before{content:""}.fa-500px:before{content:""}.fa-amazon:before{content:""}.fa-calendar-plus-o:before{content:""}.fa-calendar-minus-o:before{content:""}.fa-calendar-times-o:before{content:""}.fa-calendar-check-o:before{content:""}.fa-industry:before{content:""}.fa-map-pin:before{content:""}.fa-map-signs:before{content:""}.fa-map-o:before{content:""}.fa-map:before{content:""}.fa-commenting:before{content:""}.fa-commenting-o:before{content:""}.fa-houzz:before{content:""}.fa-vimeo:before{content:""}.fa-black-tie:before{content:""}.fa-fonticons:before{content:""}.fa-reddit-alien:before{content:""}.fa-edge:before{content:""}.fa-credit-card-alt:before{content:""}.fa-codiepie:before{content:""}.fa-modx:before{content:""}.fa-fort-awesome:before{content:""}.fa-usb:before{content:""}.fa-product-hunt:before{content:""}.fa-mixcloud:before{content:""}.fa-scribd:before{content:""}.fa-pause-circle:before{content:""}.fa-pause-circle-o:before{content:""}.fa-stop-circle:before{content:""}.fa-stop-circle-o:before{content:""}.fa-shopping-bag:before{content:""}.fa-shopping-basket:before{content:""}.fa-hashtag:before{content:""}.fa-bluetooth:before{content:""}.fa-bluetooth-b:before{content:""}.fa-percent:before{content:""}.fa-gitlab:before,.icon-gitlab:before{content:""}.fa-wpbeginner:before{content:""}.fa-wpforms:before{content:""}.fa-envira:before{content:""}.fa-universal-access:before{content:""}.fa-wheelchair-alt:before{content:""}.fa-question-circle-o:before{content:""}.fa-blind:before{content:""}.fa-audio-description:before{content:""}.fa-volume-control-phone:before{content:""}.fa-braille:before{content:""}.fa-assistive-listening-systems:before{content:""}.fa-american-sign-language-interpreting:before,.fa-asl-interpreting:before{content:""}.fa-deaf:before,.fa-deafness:before,.fa-hard-of-hearing:before{content:""}.fa-glide:before{content:""}.fa-glide-g:before{content:""}.fa-sign-language:before,.fa-signing:before{content:""}.fa-low-vision:before{content:""}.fa-viadeo:before{content:""}.fa-viadeo-square:before{content:""}.fa-snapchat:before{content:""}.fa-snapchat-ghost:before{content:""}.fa-snapchat-square:before{content:""}.fa-pied-piper:before{content:""}.fa-first-order:before{content:""}.fa-yoast:before{content:""}.fa-themeisle:before{content:""}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:""}.fa-fa:before,.fa-font-awesome:before{content:""}.fa-handshake-o:before{content:""}.fa-envelope-open:before{content:""}.fa-envelope-open-o:before{content:""}.fa-linode:before{content:""}.fa-address-book:before{content:""}.fa-address-book-o:before{content:""}.fa-address-card:before,.fa-vcard:before{content:""}.fa-address-card-o:before,.fa-vcard-o:before{content:""}.fa-user-circle:before{content:""}.fa-user-circle-o:before{content:""}.fa-user-o:before{content:""}.fa-id-badge:before{content:""}.fa-drivers-license:before,.fa-id-card:before{content:""}.fa-drivers-license-o:before,.fa-id-card-o:before{c
ontent:""}.fa-quora:before{content:""}.fa-free-code-camp:before{content:""}.fa-telegram:before{content:""}.fa-thermometer-4:before,.fa-thermometer-full:before,.fa-thermometer:before{content:""}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:""}.fa-thermometer-2:before,.fa-thermometer-half:before{content:""}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:""}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:""}.fa-shower:before{content:""}.fa-bath:before,.fa-bathtub:before,.fa-s15:before{content:""}.fa-podcast:before{content:""}.fa-window-maximize:before{content:""}.fa-window-minimize:before{content:""}.fa-window-restore:before{content:""}.fa-times-rectangle:before,.fa-window-close:before{content:""}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:""}.fa-bandcamp:before{content:""}.fa-grav:before{content:""}.fa-etsy:before{content:""}.fa-imdb:before{content:""}.fa-ravelry:before{content:""}.fa-eercast:before{content:""}.fa-microchip:before{content:""}.fa-snowflake-o:before{content:""}.fa-superpowers:before{content:""}.fa-wpexplorer:before{content:""}.fa-meetup:before{content:""}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-dropdown .caret,.wy-inline-validate.wy-inline-validate-danger .wy-input-context,.wy-inline-validate.wy-inline-validate-info .wy-input-context,.wy-inline-validate.wy-inline-validate-success .wy-input-context,.wy-inline-validate.wy-inline-validate-warning .wy-input-context,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{font-family:inherit}.fa:before,.icon:before,.rst-content .admonition-title:before,.rst-content .code-block-caption .headerlink:before,.rst-content .eqno .headerlink:before,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li 
.last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section ol.loweralpha>li,.rst-content .toctree-wrapper ol.loweralpha,.rst-content .toctree-wrapper ol.loweralpha>li,.rst-content section ol.loweralpha,.rst-content section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li,.rst-content .toctree-wrapper ol.upperalpha,.rst-content .toctree-wrapper ol.upperalpha>li,.rst-content section ol.upperalpha,.rst-content section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*,.rst-content .toctree-wrapper ol li>*,.rst-content .toctree-wrapper ul li>*,.rst-content section ol li>*,.rst-content section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child,.rst-content .toctree-wrapper ol li>:first-child,.rst-content .toctree-wrapper ul li>:first-child,.rst-content section ol li>:first-child,.rst-content section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child,.rst-content .toctree-wrapper ol li>p,.rst-content .toctree-wrapper ol li>p:last-child,.rst-content .toctree-wrapper ul li>p,.rst-content .toctree-wrapper ul li>p:last-child,.rst-content section ol li>p,.rst-content section ol li>p:last-child,.rst-content section ul li>p,.rst-content section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child,.rst-content .toctree-wrapper ol li>p:only-child,.rst-content .toctree-wrapper ol li>p:only-child:last-child,.rst-content .toctree-wrapper ul li>p:only-child,.rst-content .toctree-wrapper ul li>p:only-child:last-child,.rst-content section ol li>p:only-child,.rst-content section ol li>p:only-child:last-child,.rst-content section ul li>p:only-child,.rst-content section ul li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul,.rst-content .toctree-wrapper ol li>ol,.rst-content .toctree-wrapper ol li>ul,.rst-content .toctree-wrapper ul li>ol,.rst-content .toctree-wrapper ul li>ul,.rst-content section ol li>ol,.rst-content section ol li>ul,.rst-content section ul li>ol,.rst-content section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul,.rst-content .toctree-wrapper ol.simple li>*,.rst-content .toctree-wrapper ol.simple li ol,.rst-content .toctree-wrapper ol.simple li ul,.rst-content .toctree-wrapper ul.simple li>*,.rst-content .toctree-wrapper ul.simple li ol,.rst-content .toctree-wrapper ul.simple li ul,.rst-content section ol.simple li>*,.rst-content section ol.simple li ol,.rst-content section ol.simple li 
ul,.rst-content section ul.simple li>*,.rst-content section ul.simple li ol,.rst-content section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink{opacity:0;font-size:14px;font-family:FontAwesome;margin-left:.5em}.rst-content .code-block-caption .headerlink:focus,.rst-content .code-block-caption:hover .headerlink,.rst-content .eqno .headerlink:focus,.rst-content .eqno:hover .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink:focus,.rst-content .toctree-wrapper>p.caption:hover .headerlink,.rst-content dl dt .headerlink:focus,.rst-content dl dt:hover .headerlink,.rst-content h1 .headerlink:focus,.rst-content h1:hover .headerlink,.rst-content h2 .headerlink:focus,.rst-content h2:hover .headerlink,.rst-content h3 .headerlink:focus,.rst-content h3:hover .headerlink,.rst-content h4 .headerlink:focus,.rst-content h4:hover .headerlink,.rst-content h5 .headerlink:focus,.rst-content h5:hover .headerlink,.rst-content h6 .headerlink:focus,.rst-content h6:hover .headerlink,.rst-content p.caption .headerlink:focus,.rst-content p.caption:hover .headerlink,.rst-content p .headerlink:focus,.rst-content p:hover .headerlink,.rst-content table>caption .headerlink:focus,.rst-content table>caption:hover .headerlink{opacity:1}.rst-content p a{overflow-wrap:anywhere}.rst-content .wy-table td p,.rst-content .wy-table td ul,.rst-content .wy-table th p,.rst-content .wy-table th ul,.rst-content table.docutils td p,.rst-content table.docutils td ul,.rst-content table.docutils th p,.rst-content table.docutils th ul,.rst-content table.field-list td p,.rst-content table.field-list td ul,.rst-content table.field-list th p,.rst-content table.field-list th ul{font-size:inherit}.rst-content .btn:focus{outline:2px solid}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .citation-reference>span.fn-bracket,.rst-content 
.footnote-reference>span.fn-bracket{display:none}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : "}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:auto minmax(80%,95%)}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{display:inline-grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{display:grid;grid-template-columns:auto auto minmax(.65rem,auto) minmax(40%,95%)}html.writer-html5 .rst-content aside.citation>span.label,html.writer-html5 .rst-content aside.footnote>span.label,html.writer-html5 .rst-content div.citation>span.label{grid-column-start:1;grid-column-end:2}html.writer-html5 .rst-content aside.citation>span.backrefs,html.writer-html5 .rst-content aside.footnote>span.backrefs,html.writer-html5 .rst-content div.citation>span.backrefs{grid-column-start:2;grid-column-end:3;grid-row-start:1;grid-row-end:3}html.writer-html5 .rst-content aside.citation>p,html.writer-html5 .rst-content aside.footnote>p,html.writer-html5 .rst-content div.citation>p{grid-column-start:4;grid-column-end:5}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{margin-bottom:24px}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.citation>dt>span.brackets:before,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.citation>dt>span.brackets:after,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a,html.writer-html5 
.rst-content dl.footnote>dt>span.fn-backref>a{word-break:keep-all}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a:not(:first-child):before,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.citation>dd p,html.writer-html5 .rst-content dl.footnote>dd p{font-size:.9rem}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{padding-left:1rem;padding-right:1rem;font-size:.9rem;line-height:1.2rem}html.writer-html5 .rst-content aside.citation p,html.writer-html5 .rst-content aside.footnote p,html.writer-html5 .rst-content div.citation p{font-size:.9rem;line-height:1.2rem;margin-bottom:12px}html.writer-html5 .rst-content aside.citation span.backrefs,html.writer-html5 .rst-content aside.footnote span.backrefs,html.writer-html5 .rst-content div.citation span.backrefs{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content aside.citation span.backrefs>a,html.writer-html5 .rst-content aside.footnote span.backrefs>a,html.writer-html5 .rst-content div.citation span.backrefs>a{word-break:keep-all}html.writer-html5 .rst-content aside.citation span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content aside.footnote span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content div.citation span.backrefs>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content aside.citation span.label,html.writer-html5 .rst-content aside.footnote span.label,html.writer-html5 .rst-content div.citation span.label{line-height:1.2rem}html.writer-html5 .rst-content aside.citation-list,html.writer-html5 .rst-content aside.footnote-list,html.writer-html5 .rst-content div.citation-list{margin-bottom:24px}html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content aside.footnote-list aside.footnote,html.writer-html5 .rst-content div.citation-list>div.citation,html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content aside.footnote-list aside.footnote code,html.writer-html5 .rst-content aside.footnote-list aside.footnote tt,html.writer-html5 .rst-content aside.footnote code,html.writer-html5 .rst-content aside.footnote tt,html.writer-html5 .rst-content div.citation-list>div.citation code,html.writer-html5 .rst-content div.citation-list>div.citation tt,html.writer-html5 .rst-content dl.citation code,html.writer-html5 .rst-content dl.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content 
.wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel,.rst-content .menuselection{font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .guilabel,.rst-content .menuselection{border:1px solid #7fbbe3;background:#e7f2fa}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_static/doctools.js b/branch/seperate_maz_and_taz/_static/doctools.js new file mode 100644 index 0000000..d06a71d --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. 
+ */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git 
a/branch/seperate_maz_and_taz/_static/documentation_options.js b/branch/seperate_maz_and_taz/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/documentation_options.js @@ -0,0 +1,14 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_static/file.png b/branch/seperate_maz_and_taz/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/file.png differ diff --git a/branch/seperate_maz_and_taz/_static/graphviz.css b/branch/seperate_maz_and_taz/_static/graphviz.css new file mode 100644 index 0000000..8d81c02 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/branch/seperate_maz_and_taz/_static/jquery.js b/branch/seperate_maz_and_taz/_static/jquery.js new file mode 100644 index 0000000..c4c6022 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/jquery.js @@ -0,0 +1,2 @@ +/*! jQuery v3.6.0 | (c) OpenJS Foundation and other contributors | jquery.org/license */ +!function(e,t){"use strict";"object"==typeof module&&"object"==typeof module.exports?module.exports=e.document?t(e,!0):function(e){if(!e.document)throw new Error("jQuery requires a window with a document");return t(e)}:t(e)}("undefined"!=typeof window?window:this,function(C,e){"use strict";var t=[],r=Object.getPrototypeOf,s=t.slice,g=t.flat?function(e){return t.flat.call(e)}:function(e){return t.concat.apply([],e)},u=t.push,i=t.indexOf,n={},o=n.toString,v=n.hasOwnProperty,a=v.toString,l=a.call(Object),y={},m=function(e){return"function"==typeof e&&"number"!=typeof e.nodeType&&"function"!=typeof e.item},x=function(e){return null!=e&&e===e.window},E=C.document,c={type:!0,src:!0,nonce:!0,noModule:!0};function b(e,t,n){var r,i,o=(n=n||E).createElement("script");if(o.text=e,t)for(r in c)(i=t[r]||t.getAttribute&&t.getAttribute(r))&&o.setAttribute(r,i);n.head.appendChild(o).parentNode.removeChild(o)}function w(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?n[o.call(e)]||"object":typeof e}var f="3.6.0",S=function(e,t){return new S.fn.init(e,t)};function p(e){var t=!!e&&"length"in e&&e.length,n=w(e);return!m(e)&&!x(e)&&("array"===n||0===t||"number"==typeof t&&0+~]|"+M+")"+M+"*"),U=new RegExp(M+"|>"),X=new RegExp(F),V=new RegExp("^"+I+"$"),G={ID:new RegExp("^#("+I+")"),CLASS:new RegExp("^\\.("+I+")"),TAG:new RegExp("^("+I+"|[*])"),ATTR:new RegExp("^"+W),PSEUDO:new RegExp("^"+F),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+M+"*(even|odd|(([+-]|)(\\d*)n|)"+M+"*(?:([+-]|)"+M+"*(\\d+)|))"+M+"*\\)|)","i"),bool:new RegExp("^(?:"+R+")$","i"),needsContext:new 
RegExp("^"+M+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+M+"*((?:-\\d)?\\d*)"+M+"*\\)|)(?=[^-]|$)","i")},Y=/HTML$/i,Q=/^(?:input|select|textarea|button)$/i,J=/^h\d$/i,K=/^[^{]+\{\s*\[native \w/,Z=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,ee=/[+~]/,te=new RegExp("\\\\[\\da-fA-F]{1,6}"+M+"?|\\\\([^\\r\\n\\f])","g"),ne=function(e,t){var n="0x"+e.slice(1)-65536;return t||(n<0?String.fromCharCode(n+65536):String.fromCharCode(n>>10|55296,1023&n|56320))},re=/([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g,ie=function(e,t){return t?"\0"===e?"\ufffd":e.slice(0,-1)+"\\"+e.charCodeAt(e.length-1).toString(16)+" ":"\\"+e},oe=function(){T()},ae=be(function(e){return!0===e.disabled&&"fieldset"===e.nodeName.toLowerCase()},{dir:"parentNode",next:"legend"});try{H.apply(t=O.call(p.childNodes),p.childNodes),t[p.childNodes.length].nodeType}catch(e){H={apply:t.length?function(e,t){L.apply(e,O.call(t))}:function(e,t){var n=e.length,r=0;while(e[n++]=t[r++]);e.length=n-1}}}function se(t,e,n,r){var i,o,a,s,u,l,c,f=e&&e.ownerDocument,p=e?e.nodeType:9;if(n=n||[],"string"!=typeof t||!t||1!==p&&9!==p&&11!==p)return n;if(!r&&(T(e),e=e||C,E)){if(11!==p&&(u=Z.exec(t)))if(i=u[1]){if(9===p){if(!(a=e.getElementById(i)))return n;if(a.id===i)return n.push(a),n}else if(f&&(a=f.getElementById(i))&&y(e,a)&&a.id===i)return n.push(a),n}else{if(u[2])return H.apply(n,e.getElementsByTagName(t)),n;if((i=u[3])&&d.getElementsByClassName&&e.getElementsByClassName)return H.apply(n,e.getElementsByClassName(i)),n}if(d.qsa&&!N[t+" "]&&(!v||!v.test(t))&&(1!==p||"object"!==e.nodeName.toLowerCase())){if(c=t,f=e,1===p&&(U.test(t)||z.test(t))){(f=ee.test(t)&&ye(e.parentNode)||e)===e&&d.scope||((s=e.getAttribute("id"))?s=s.replace(re,ie):e.setAttribute("id",s=S)),o=(l=h(t)).length;while(o--)l[o]=(s?"#"+s:":scope")+" "+xe(l[o]);c=l.join(",")}try{return H.apply(n,f.querySelectorAll(c)),n}catch(e){N(t,!0)}finally{s===S&&e.removeAttribute("id")}}}return g(t.replace($,"$1"),e,n,r)}function ue(){var r=[];return function e(t,n){return r.push(t+" ")>b.cacheLength&&delete e[r.shift()],e[t+" "]=n}}function le(e){return e[S]=!0,e}function ce(e){var t=C.createElement("fieldset");try{return!!e(t)}catch(e){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function fe(e,t){var n=e.split("|"),r=n.length;while(r--)b.attrHandle[n[r]]=t}function pe(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&e.sourceIndex-t.sourceIndex;if(r)return r;if(n)while(n=n.nextSibling)if(n===t)return-1;return e?1:-1}function de(t){return function(e){return"input"===e.nodeName.toLowerCase()&&e.type===t}}function he(n){return function(e){var t=e.nodeName.toLowerCase();return("input"===t||"button"===t)&&e.type===n}}function ge(t){return function(e){return"form"in e?e.parentNode&&!1===e.disabled?"label"in e?"label"in e.parentNode?e.parentNode.disabled===t:e.disabled===t:e.isDisabled===t||e.isDisabled!==!t&&ae(e)===t:e.disabled===t:"label"in e&&e.disabled===t}}function ve(a){return le(function(o){return o=+o,le(function(e,t){var n,r=a([],e.length,o),i=r.length;while(i--)e[n=r[i]]&&(e[n]=!(t[n]=e[n]))})})}function ye(e){return e&&"undefined"!=typeof e.getElementsByTagName&&e}for(e in d=se.support={},i=se.isXML=function(e){var t=e&&e.namespaceURI,n=e&&(e.ownerDocument||e).documentElement;return!Y.test(t||n&&n.nodeName||"HTML")},T=se.setDocument=function(e){var t,n,r=e?e.ownerDocument||e:p;return 
r!=C&&9===r.nodeType&&r.documentElement&&(a=(C=r).documentElement,E=!i(C),p!=C&&(n=C.defaultView)&&n.top!==n&&(n.addEventListener?n.addEventListener("unload",oe,!1):n.attachEvent&&n.attachEvent("onunload",oe)),d.scope=ce(function(e){return a.appendChild(e).appendChild(C.createElement("div")),"undefined"!=typeof e.querySelectorAll&&!e.querySelectorAll(":scope fieldset div").length}),d.attributes=ce(function(e){return e.className="i",!e.getAttribute("className")}),d.getElementsByTagName=ce(function(e){return e.appendChild(C.createComment("")),!e.getElementsByTagName("*").length}),d.getElementsByClassName=K.test(C.getElementsByClassName),d.getById=ce(function(e){return a.appendChild(e).id=S,!C.getElementsByName||!C.getElementsByName(S).length}),d.getById?(b.filter.ID=function(e){var t=e.replace(te,ne);return function(e){return e.getAttribute("id")===t}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n=t.getElementById(e);return n?[n]:[]}}):(b.filter.ID=function(e){var n=e.replace(te,ne);return function(e){var t="undefined"!=typeof e.getAttributeNode&&e.getAttributeNode("id");return t&&t.value===n}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n,r,i,o=t.getElementById(e);if(o){if((n=o.getAttributeNode("id"))&&n.value===e)return[o];i=t.getElementsByName(e),r=0;while(o=i[r++])if((n=o.getAttributeNode("id"))&&n.value===e)return[o]}return[]}}),b.find.TAG=d.getElementsByTagName?function(e,t){return"undefined"!=typeof t.getElementsByTagName?t.getElementsByTagName(e):d.qsa?t.querySelectorAll(e):void 0}:function(e,t){var n,r=[],i=0,o=t.getElementsByTagName(e);if("*"===e){while(n=o[i++])1===n.nodeType&&r.push(n);return r}return o},b.find.CLASS=d.getElementsByClassName&&function(e,t){if("undefined"!=typeof t.getElementsByClassName&&E)return t.getElementsByClassName(e)},s=[],v=[],(d.qsa=K.test(C.querySelectorAll))&&(ce(function(e){var t;a.appendChild(e).innerHTML="",e.querySelectorAll("[msallowcapture^='']").length&&v.push("[*^$]="+M+"*(?:''|\"\")"),e.querySelectorAll("[selected]").length||v.push("\\["+M+"*(?:value|"+R+")"),e.querySelectorAll("[id~="+S+"-]").length||v.push("~="),(t=C.createElement("input")).setAttribute("name",""),e.appendChild(t),e.querySelectorAll("[name='']").length||v.push("\\["+M+"*name"+M+"*="+M+"*(?:''|\"\")"),e.querySelectorAll(":checked").length||v.push(":checked"),e.querySelectorAll("a#"+S+"+*").length||v.push(".#.+[+~]"),e.querySelectorAll("\\\f"),v.push("[\\r\\n\\f]")}),ce(function(e){e.innerHTML="";var t=C.createElement("input");t.setAttribute("type","hidden"),e.appendChild(t).setAttribute("name","D"),e.querySelectorAll("[name=d]").length&&v.push("name"+M+"*[*^$|!~]?="),2!==e.querySelectorAll(":enabled").length&&v.push(":enabled",":disabled"),a.appendChild(e).disabled=!0,2!==e.querySelectorAll(":disabled").length&&v.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),v.push(",.*:")})),(d.matchesSelector=K.test(c=a.matches||a.webkitMatchesSelector||a.mozMatchesSelector||a.oMatchesSelector||a.msMatchesSelector))&&ce(function(e){d.disconnectedMatch=c.call(e,"*"),c.call(e,"[s!='']:x"),s.push("!=",F)}),v=v.length&&new RegExp(v.join("|")),s=s.length&&new RegExp(s.join("|")),t=K.test(a.compareDocumentPosition),y=t||K.test(a.contains)?function(e,t){var n=9===e.nodeType?e.documentElement:e,r=t&&t.parentNode;return 
e===r||!(!r||1!==r.nodeType||!(n.contains?n.contains(r):e.compareDocumentPosition&&16&e.compareDocumentPosition(r)))}:function(e,t){if(t)while(t=t.parentNode)if(t===e)return!0;return!1},j=t?function(e,t){if(e===t)return l=!0,0;var n=!e.compareDocumentPosition-!t.compareDocumentPosition;return n||(1&(n=(e.ownerDocument||e)==(t.ownerDocument||t)?e.compareDocumentPosition(t):1)||!d.sortDetached&&t.compareDocumentPosition(e)===n?e==C||e.ownerDocument==p&&y(p,e)?-1:t==C||t.ownerDocument==p&&y(p,t)?1:u?P(u,e)-P(u,t):0:4&n?-1:1)}:function(e,t){if(e===t)return l=!0,0;var n,r=0,i=e.parentNode,o=t.parentNode,a=[e],s=[t];if(!i||!o)return e==C?-1:t==C?1:i?-1:o?1:u?P(u,e)-P(u,t):0;if(i===o)return pe(e,t);n=e;while(n=n.parentNode)a.unshift(n);n=t;while(n=n.parentNode)s.unshift(n);while(a[r]===s[r])r++;return r?pe(a[r],s[r]):a[r]==p?-1:s[r]==p?1:0}),C},se.matches=function(e,t){return se(e,null,null,t)},se.matchesSelector=function(e,t){if(T(e),d.matchesSelector&&E&&!N[t+" "]&&(!s||!s.test(t))&&(!v||!v.test(t)))try{var n=c.call(e,t);if(n||d.disconnectedMatch||e.document&&11!==e.document.nodeType)return n}catch(e){N(t,!0)}return 0":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(te,ne),e[3]=(e[3]||e[4]||e[5]||"").replace(te,ne),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&X.test(n)&&(t=h(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(te,ne).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=m[e+" "];return t||(t=new RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&m(e,function(e){return t.test("string"==typeof e.className&&e.className||"undefined"!=typeof e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(n,r,i){return function(e){var t=se.attr(e,n);return null==t?"!="===r:!r||(t+="","="===r?t===i:"!="===r?t!==i:"^="===r?i&&0===t.indexOf(i):"*="===r?i&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return m(n)?S.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?S.grep(e,function(e){return e===n!==r}):"string"!=typeof n?S.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(S.fn.init=function(e,t,n){var r,i;if(!e)return this;if(n=n||D,"string"==typeof e){if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:q.exec(e))||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof S?t[0]:t,S.merge(this,S.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:E,!0)),N.test(r[1])&&S.isPlainObject(t))for(r in t)m(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(i=E.getElementById(r[2]))&&(this[0]=i,this.length=1),this}return e.nodeType?(this[0]=e,this.length=1,this):m(e)?void 0!==n.ready?n.ready(e):e(S):S.makeArray(e,this)}).prototype=S.fn,D=S(E);var L=/^(?:parents|prev(?:Until|All))/,H={children:!0,contents:!0,next:!0,prev:!0};function O(e,t){while((e=e[t])&&1!==e.nodeType);return e}S.fn.extend({has:function(e){var t=S(e,this),n=t.length;return this.filter(function(){for(var 
e=0;e\x20\t\r\n\f]*)/i,he=/^$|^module$|\/(?:java|ecma)script/i;ce=E.createDocumentFragment().appendChild(E.createElement("div")),(fe=E.createElement("input")).setAttribute("type","radio"),fe.setAttribute("checked","checked"),fe.setAttribute("name","t"),ce.appendChild(fe),y.checkClone=ce.cloneNode(!0).cloneNode(!0).lastChild.checked,ce.innerHTML="",y.noCloneChecked=!!ce.cloneNode(!0).lastChild.defaultValue,ce.innerHTML="",y.option=!!ce.lastChild;var ge={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function ve(e,t){var n;return n="undefined"!=typeof e.getElementsByTagName?e.getElementsByTagName(t||"*"):"undefined"!=typeof e.querySelectorAll?e.querySelectorAll(t||"*"):[],void 0===t||t&&A(e,t)?S.merge([e],n):n}function ye(e,t){for(var n=0,r=e.length;n",""]);var me=/<|&#?\w+;/;function xe(e,t,n,r,i){for(var o,a,s,u,l,c,f=t.createDocumentFragment(),p=[],d=0,h=e.length;d\s*$/g;function je(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&S(e).children("tbody")[0]||e}function De(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function qe(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Le(e,t){var n,r,i,o,a,s;if(1===t.nodeType){if(Y.hasData(e)&&(s=Y.get(e).events))for(i in Y.remove(t,"handle events"),s)for(n=0,r=s[i].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",i=function(e){r.remove(),i=null,e&&t("error"===e.type?404:200,e.type)}),E.head.appendChild(r[0])},abort:function(){i&&i()}}});var _t,zt=[],Ut=/(=)\?(?=&|$)|\?\?/;S.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=zt.pop()||S.expando+"_"+wt.guid++;return this[e]=!0,e}}),S.ajaxPrefilter("json jsonp",function(e,t,n){var r,i,o,a=!1!==e.jsonp&&(Ut.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Ut.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=m(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Ut,"$1"+r):!1!==e.jsonp&&(e.url+=(Tt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return o||S.error(r+" was not called"),o[0]},e.dataTypes[0]="json",i=C[r],C[r]=function(){o=arguments},n.always(function(){void 0===i?S(C).removeProp(r):C[r]=i,e[r]&&(e.jsonpCallback=t.jsonpCallback,zt.push(r)),o&&m(i)&&i(o[0]),o=i=void 0}),"script"}),y.createHTMLDocument=((_t=E.implementation.createHTMLDocument("").body).innerHTML="
",2===_t.childNodes.length),S.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=E.implementation.createHTMLDocument("")).createElement("base")).href=E.location.href,t.head.appendChild(r)):t=E),o=!n&&[],(i=N.exec(e))?[t.createElement(i[1])]:(i=xe([e],t,o),o&&o.length&&S(o).remove(),S.merge([],i.childNodes)));var r,i,o},S.fn.load=function(e,t,n){var r,i,o,a=this,s=e.indexOf(" ");return-1").append(S.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,o||[e.responseText,t,e])})}),this},S.expr.pseudos.animated=function(t){return S.grep(S.timers,function(e){return t===e.elem}).length},S.offset={setOffset:function(e,t,n){var r,i,o,a,s,u,l=S.css(e,"position"),c=S(e),f={};"static"===l&&(e.style.position="relative"),s=c.offset(),o=S.css(e,"top"),u=S.css(e,"left"),("absolute"===l||"fixed"===l)&&-1<(o+u).indexOf("auto")?(a=(r=c.position()).top,i=r.left):(a=parseFloat(o)||0,i=parseFloat(u)||0),m(t)&&(t=t.call(e,n,S.extend({},s))),null!=t.top&&(f.top=t.top-s.top+a),null!=t.left&&(f.left=t.left-s.left+i),"using"in t?t.using.call(e,f):c.css(f)}},S.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){S.offset.setOffset(this,t,e)});var e,n,r=this[0];return r?r.getClientRects().length?(e=r.getBoundingClientRect(),n=r.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var e,t,n,r=this[0],i={top:0,left:0};if("fixed"===S.css(r,"position"))t=r.getBoundingClientRect();else{t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;while(e&&(e===n.body||e===n.documentElement)&&"static"===S.css(e,"position"))e=e.parentNode;e&&e!==r&&1===e.nodeType&&((i=S(e).offset()).top+=S.css(e,"borderTopWidth",!0),i.left+=S.css(e,"borderLeftWidth",!0))}return{top:t.top-i.top-S.css(r,"marginTop",!0),left:t.left-i.left-S.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent;while(e&&"static"===S.css(e,"position"))e=e.offsetParent;return e||re})}}),S.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,i){var o="pageYOffset"===i;S.fn[t]=function(e){return $(this,function(e,t,n){var r;if(x(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n)return r?r[i]:e[t];r?r.scrollTo(o?r.pageXOffset:n,o?n:r.pageYOffset):e[t]=n},t,e,arguments.length)}}),S.each(["top","left"],function(e,n){S.cssHooks[n]=Fe(y.pixelPosition,function(e,t){if(t)return t=We(e,n),Pe.test(t)?S(e).position()[n]+"px":t})}),S.each({Height:"height",Width:"width"},function(a,s){S.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,o){S.fn[o]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),i=r||(!0===e||!0===t?"margin":"border");return $(this,function(e,t,n){var r;return x(e)?0===o.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?S.css(e,t,i):S.style(e,t,n,i)},s,n?e:void 0,n)}})}),S.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){S.fn[t]=function(e){return this.on(t,e)}}),S.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return 
this.mouseenter(e).mouseleave(t||e)}}),S.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){S.fn[n]=function(e,t){return 0",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=y.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=y.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),y.elements=c+" "+a,j(b)}function f(a){var b=x[a[v]];return b||(b={},w++,a[v]=w,x[w]=b),b}function g(a,c,d){if(c||(c=b),q)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():u.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||t.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),q)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return y.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(y,b.frag)}function j(a){a||(a=b);var d=f(a);return!y.shivCSS||p||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),q||i(a,d),a}function k(a){for(var b,c=a.getElementsByTagName("*"),e=c.length,f=RegExp("^(?:"+d().join("|")+")$","i"),g=[];e--;)b=c[e],f.test(b.nodeName)&&g.push(b.applyElement(l(b)));return g}function l(a){for(var b,c=a.attributes,d=c.length,e=a.ownerDocument.createElement(A+":"+a.nodeName);d--;)b=c[d],b.specified&&e.setAttribute(b.nodeName,b.nodeValue);return e.style.cssText=a.style.cssText,e}function m(a){for(var b,c=a.split("{"),e=c.length,f=RegExp("(^|[\\s,>+~])("+d().join("|")+")(?=[[\\s,>+~#.:]|$)","gi"),g="$1"+A+"\\:$2";e--;)b=c[e]=c[e].split("}"),b[b.length-1]=b[b.length-1].replace(f,g),c[e]=b.join("}");return c.join("{")}function n(a){for(var b=a.length;b--;)a[b].removeNode()}function o(a){function b(){clearTimeout(g._removeSheetTimer),d&&d.removeNode(!0),d=null}var d,e,g=f(a),h=a.namespaces,i=a.parentWindow;return!B||a.printShived?a:("undefined"==typeof h[A]&&h.add(A),i.attachEvent("onbeforeprint",function(){b();for(var f,g,h,i=a.styleSheets,j=[],l=i.length,n=Array(l);l--;)n[l]=i[l];for(;h=n.pop();)if(!h.disabled&&z.test(h.media)){try{f=h.imports,g=f.length}catch(o){g=0}for(l=0;g>l;l++)n.push(f[l]);try{j.push(h.cssText)}catch(o){}}j=m(j.reverse().join("")),e=k(a),d=c(a,j)}),i.attachEvent("onafterprint",function(){n(e),clearTimeout(g._removeSheetTimer),g._removeSheetTimer=setTimeout(b,500)}),a.printShived=!0,a)}var p,q,r="3.7.3",s=a.html5||{},t=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,u=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,v="_html5shiv",w=0,x={};!function(){try{var a=b.createElement("a");a.innerHTML="",p="hidden"in a,q=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof 
a.createElement}()}catch(c){p=!0,q=!0}}();var y={elements:s.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:r,shivCSS:s.shivCSS!==!1,supportsUnknownElements:q,shivMethods:s.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=y,j(b);var z=/^$|\b(?:all|print)\b/,A="html5shiv",B=!q&&function(){var c=b.documentElement;return!("undefined"==typeof b.namespaces||"undefined"==typeof b.parentWindow||"undefined"==typeof c.applyElement||"undefined"==typeof c.removeNode||"undefined"==typeof a.attachEvent)}();y.type+=" print",y.shivPrint=o,o(b),"object"==typeof module&&module.exports&&(module.exports=y)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_static/js/html5shiv.min.js b/branch/seperate_maz_and_taz/_static/js/html5shiv.min.js new file mode 100644 index 0000000..cd1c674 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/js/html5shiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3 | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=t.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=t.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),t.elements=c+" "+a,j(b)}function f(a){var b=s[a[q]];return b||(b={},r++,a[q]=r,s[r]=b),b}function g(a,c,d){if(c||(c=b),l)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():p.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||o.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),l)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return t.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(t,b.frag)}function j(a){a||(a=b);var d=f(a);return!t.shivCSS||k||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),l||i(a,d),a}var k,l,m="3.7.3-pre",n=a.html5||{},o=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,p=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,q="_html5shiv",r=0,s={};!function(){try{var a=b.createElement("a");a.innerHTML="",k="hidden"in a,l=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){k=!0,l=!0}}();var t={elements:n.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time 
video",version:m,shivCSS:n.shivCSS!==!1,supportsUnknownElements:l,shivMethods:n.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=t,j(b),"object"==typeof module&&module.exports&&(module.exports=t)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_static/js/theme.js b/branch/seperate_maz_and_taz/_static/js/theme.js new file mode 100644 index 0000000..1fddb6e --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/js/theme.js @@ -0,0 +1 @@ +!function(n){var e={};function t(i){if(e[i])return e[i].exports;var o=e[i]={i:i,l:!1,exports:{}};return n[i].call(o.exports,o,o.exports,t),o.l=!0,o.exports}t.m=n,t.c=e,t.d=function(n,e,i){t.o(n,e)||Object.defineProperty(n,e,{enumerable:!0,get:i})},t.r=function(n){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(n,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(n,"__esModule",{value:!0})},t.t=function(n,e){if(1&e&&(n=t(n)),8&e)return n;if(4&e&&"object"==typeof n&&n&&n.__esModule)return n;var i=Object.create(null);if(t.r(i),Object.defineProperty(i,"default",{enumerable:!0,value:n}),2&e&&"string"!=typeof n)for(var o in n)t.d(i,o,function(e){return n[e]}.bind(null,o));return i},t.n=function(n){var e=n&&n.__esModule?function(){return n.default}:function(){return n};return t.d(e,"a",e),e},t.o=function(n,e){return Object.prototype.hasOwnProperty.call(n,e)},t.p="",t(t.s=0)}([function(n,e,t){t(1),n.exports=t(3)},function(n,e,t){(function(){var e="undefined"!=typeof window?window.jQuery:t(2);n.exports.ThemeNav={navBar:null,win:null,winScroll:!1,winResize:!1,linkScroll:!1,winPosition:0,winHeight:null,docHeight:null,isRunning:!1,enable:function(n){var t=this;void 0===n&&(n=!0),t.isRunning||(t.isRunning=!0,e((function(e){t.init(e),t.reset(),t.win.on("hashchange",t.reset),n&&t.win.on("scroll",(function(){t.linkScroll||t.winScroll||(t.winScroll=!0,requestAnimationFrame((function(){t.onScroll()})))})),t.win.on("resize",(function(){t.winResize||(t.winResize=!0,requestAnimationFrame((function(){t.onResize()})))})),t.onResize()})))},enableSticky:function(){this.enable(!0)},init:function(n){n(document);var e=this;this.navBar=n("div.wy-side-scroll:first"),this.win=n(window),n(document).on("click","[data-toggle='wy-nav-top']",(function(){n("[data-toggle='wy-nav-shift']").toggleClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift")})).on("click",".wy-menu-vertical .current ul li a",(function(){var t=n(this);n("[data-toggle='wy-nav-shift']").removeClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift"),e.toggleCurrent(t),e.hashChange()})).on("click","[data-toggle='rst-current-version']",(function(){n("[data-toggle='rst-versions']").toggleClass("shift-up")})),n("table.docutils:not(.field-list,.footnote,.citation)").wrap("
"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t0 + var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 + var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 + var s_v = "^(" + C + ")?" 
+ v; // vowel in stem + + this.stemWord = function (w) { + var stem; + var suffix; + var firstch; + var origword = w; + + if (w.length < 3) + return w; + + var re; + var re2; + var re3; + var re4; + + firstch = w.substr(0,1); + if (firstch == "y") + w = firstch.toUpperCase() + w.substr(1); + + // Step 1a + re = /^(.+?)(ss|i)es$/; + re2 = /^(.+?)([^s])s$/; + + if (re.test(w)) + w = w.replace(re,"$1$2"); + else if (re2.test(w)) + w = w.replace(re2,"$1$2"); + + // Step 1b + re = /^(.+?)eed$/; + re2 = /^(.+?)(ed|ing)$/; + if (re.test(w)) { + var fp = re.exec(w); + re = new RegExp(mgr0); + if (re.test(fp[1])) { + re = /.$/; + w = w.replace(re,""); + } + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = new RegExp(s_v); + if (re2.test(stem)) { + w = stem; + re2 = /(at|bl|iz)$/; + re3 = new RegExp("([^aeiouylsz])\\1$"); + re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re2.test(w)) + w = w + "e"; + else if (re3.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + else if (re4.test(w)) + w = w + "e"; + } + } + + // Step 1c + re = /^(.+?)y$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(s_v); + if (re.test(stem)) + w = stem + "i"; + } + + // Step 2 + re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step2list[suffix]; + } + + // Step 3 + re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step3list[suffix]; + } + + // Step 4 + re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + re2 = /^(.+?)(s|t)(ion)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + if (re.test(stem)) + w = stem; + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = new RegExp(mgr1); + if (re2.test(stem)) + w = stem; + } + + // Step 5 + re = /^(.+?)e$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + re2 = new RegExp(meq1); + re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) + w = stem; + } + re = /ll$/; + re2 = new RegExp(mgr1); + if (re.test(w) && re2.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + + // and turn initial Y back to y + if (firstch == "y") + w = firstch.toLowerCase() + w.substr(1); + return w; + } +} + diff --git a/branch/seperate_maz_and_taz/_static/minus.png b/branch/seperate_maz_and_taz/_static/minus.png new file mode 100644 index 0000000..d96755f Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/minus.png differ diff --git a/branch/seperate_maz_and_taz/_static/plus.png b/branch/seperate_maz_and_taz/_static/plus.png new file mode 100644 index 0000000..7107cec Binary files /dev/null and b/branch/seperate_maz_and_taz/_static/plus.png differ diff --git a/branch/seperate_maz_and_taz/_static/pygments.css b/branch/seperate_maz_and_taz/_static/pygments.css new file mode 100644 index 0000000..84ab303 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/pygments.css @@ -0,0 +1,75 @@ +pre { line-height: 125%; } +td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +span.linenos { color: inherit; 
background-color: transparent; padding-left: 5px; padding-right: 5px; } +td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +.highlight .hll { background-color: #ffffcc } +.highlight { background: #f8f8f8; } +.highlight .c { color: #3D7B7B; font-style: italic } /* Comment */ +.highlight .err { border: 1px solid #FF0000 } /* Error */ +.highlight .k { color: #008000; font-weight: bold } /* Keyword */ +.highlight .o { color: #666666 } /* Operator */ +.highlight .ch { color: #3D7B7B; font-style: italic } /* Comment.Hashbang */ +.highlight .cm { color: #3D7B7B; font-style: italic } /* Comment.Multiline */ +.highlight .cp { color: #9C6500 } /* Comment.Preproc */ +.highlight .cpf { color: #3D7B7B; font-style: italic } /* Comment.PreprocFile */ +.highlight .c1 { color: #3D7B7B; font-style: italic } /* Comment.Single */ +.highlight .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */ +.highlight .gd { color: #A00000 } /* Generic.Deleted */ +.highlight .ge { font-style: italic } /* Generic.Emph */ +.highlight .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */ +.highlight .gr { color: #E40000 } /* Generic.Error */ +.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ +.highlight .gi { color: #008400 } /* Generic.Inserted */ +.highlight .go { color: #717171 } /* Generic.Output */ +.highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */ +.highlight .gs { font-weight: bold } /* Generic.Strong */ +.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ +.highlight .gt { color: #0044DD } /* Generic.Traceback */ +.highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */ +.highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */ +.highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */ +.highlight .kp { color: #008000 } /* Keyword.Pseudo */ +.highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */ +.highlight .kt { color: #B00040 } /* Keyword.Type */ +.highlight .m { color: #666666 } /* Literal.Number */ +.highlight .s { color: #BA2121 } /* Literal.String */ +.highlight .na { color: #687822 } /* Name.Attribute */ +.highlight .nb { color: #008000 } /* Name.Builtin */ +.highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */ +.highlight .no { color: #880000 } /* Name.Constant */ +.highlight .nd { color: #AA22FF } /* Name.Decorator */ +.highlight .ni { color: #717171; font-weight: bold } /* Name.Entity */ +.highlight .ne { color: #CB3F38; font-weight: bold } /* Name.Exception */ +.highlight .nf { color: #0000FF } /* Name.Function */ +.highlight .nl { color: #767600 } /* Name.Label */ +.highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */ +.highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */ +.highlight .nv { color: #19177C } /* Name.Variable */ +.highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */ +.highlight .w { color: #bbbbbb } /* Text.Whitespace */ +.highlight .mb { color: #666666 } /* Literal.Number.Bin */ +.highlight .mf { color: #666666 } /* Literal.Number.Float */ +.highlight .mh { color: #666666 } /* Literal.Number.Hex */ +.highlight .mi { color: #666666 } /* Literal.Number.Integer */ +.highlight .mo { color: #666666 } /* Literal.Number.Oct */ +.highlight .sa { color: #BA2121 } /* Literal.String.Affix */ 
+.highlight .sb { color: #BA2121 } /* Literal.String.Backtick */ +.highlight .sc { color: #BA2121 } /* Literal.String.Char */ +.highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */ +.highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */ +.highlight .s2 { color: #BA2121 } /* Literal.String.Double */ +.highlight .se { color: #AA5D1F; font-weight: bold } /* Literal.String.Escape */ +.highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */ +.highlight .si { color: #A45A77; font-weight: bold } /* Literal.String.Interpol */ +.highlight .sx { color: #008000 } /* Literal.String.Other */ +.highlight .sr { color: #A45A77 } /* Literal.String.Regex */ +.highlight .s1 { color: #BA2121 } /* Literal.String.Single */ +.highlight .ss { color: #19177C } /* Literal.String.Symbol */ +.highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */ +.highlight .fm { color: #0000FF } /* Name.Function.Magic */ +.highlight .vc { color: #19177C } /* Name.Variable.Class */ +.highlight .vg { color: #19177C } /* Name.Variable.Global */ +.highlight .vi { color: #19177C } /* Name.Variable.Instance */ +.highlight .vm { color: #19177C } /* Name.Variable.Magic */ +.highlight .il { color: #666666 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/_static/searchtools.js b/branch/seperate_maz_and_taz/_static/searchtools.js new file mode 100644 index 0000000..97d56a7 --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/searchtools.js @@ -0,0 +1,566 @@ +/* + * searchtools.js + * ~~~~~~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for the full-text search. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +/** + * Simple result scoring code. + */ +if (typeof Scorer === "undefined") { + var Scorer = { + // Implement the following function to further tweak the score for each result + // The function takes a result array [docname, title, anchor, descr, score, filename] + // and returns the new score. + /* + score: result => { + const [docname, title, anchor, descr, score, filename] = result + return score + }, + */ + + // query matches the full name of an object + objNameMatch: 11, + // or matches in the last dotted part of the object name + objPartialMatch: 6, + // Additive scores depending on the priority of the object + objPrio: { + 0: 15, // used to be importantResults + 1: 5, // used to be objectResults + 2: -5, // used to be unimportantResults + }, + // Used when the priority is not in the mapping. 
+ objPrioDefault: 0, + + // query found in title + title: 15, + partialTitle: 7, + // query found in terms + term: 5, + partialTerm: 2, + }; +} + +const _removeChildren = (element) => { + while (element && element.lastChild) element.removeChild(element.lastChild); +}; + +/** + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping + */ +const _escapeRegExp = (string) => + string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string + +const _displayItem = (item, searchTerms) => { + const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; + const docUrlRoot = DOCUMENTATION_OPTIONS.URL_ROOT; + const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; + const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; + const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; + + const [docName, title, anchor, descr, score, _filename] = item; + + let listItem = document.createElement("li"); + let requestUrl; + let linkUrl; + if (docBuilder === "dirhtml") { + // dirhtml builder + let dirname = docName + "/"; + if (dirname.match(/\/index\/$/)) + dirname = dirname.substring(0, dirname.length - 6); + else if (dirname === "index/") dirname = ""; + requestUrl = docUrlRoot + dirname; + linkUrl = requestUrl; + } else { + // normal html builders + requestUrl = docUrlRoot + docName + docFileSuffix; + linkUrl = docName + docLinkSuffix; + } + let linkEl = listItem.appendChild(document.createElement("a")); + linkEl.href = linkUrl + anchor; + linkEl.dataset.score = score; + linkEl.innerHTML = title; + if (descr) + listItem.appendChild(document.createElement("span")).innerHTML = + " (" + descr + ")"; + else if (showSearchSummary) + fetch(requestUrl) + .then((responseData) => responseData.text()) + .then((data) => { + if (data) + listItem.appendChild( + Search.makeSearchSummary(data, searchTerms) + ); + }); + Search.output.appendChild(listItem); +}; +const _finishSearch = (resultCount) => { + Search.stopPulse(); + Search.title.innerText = _("Search Results"); + if (!resultCount) + Search.status.innerText = Documentation.gettext( + "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." + ); + else + Search.status.innerText = _( + `Search finished, found ${resultCount} page(s) matching the search query.` + ); +}; +const _displayNextItem = ( + results, + resultCount, + searchTerms +) => { + // results left, load the summary and display it + // this is intended to be dynamic (don't sub resultsCount) + if (results.length) { + _displayItem(results.pop(), searchTerms); + setTimeout( + () => _displayNextItem(results, resultCount, searchTerms), + 5 + ); + } + // search finished, update title and status message + else _finishSearch(resultCount); +}; + +/** + * Default splitQuery function. Can be overridden in ``sphinx.search`` with a + * custom function per language. + * + * The regular expression works by splitting the string on consecutive characters + * that are not Unicode letters, numbers, underscores, or emoji characters. + * This is the same as ``\W+`` in Python, preserving the surrogate pair area. 
+ */ +if (typeof splitQuery === "undefined") { + var splitQuery = (query) => query + .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) + .filter(term => term) // remove remaining empty strings +} + +/** + * Search Module + */ +const Search = { + _index: null, + _queued_query: null, + _pulse_status: -1, + + htmlToText: (htmlString) => { + const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); + htmlElement.querySelectorAll(".headerlink").forEach((el) => { el.remove() }); + const docContent = htmlElement.querySelector('[role="main"]'); + if (docContent !== undefined) return docContent.textContent; + console.warn( + "Content block not found. Sphinx search tries to obtain it via '[role=main]'. Could you check your theme or template." + ); + return ""; + }, + + init: () => { + const query = new URLSearchParams(window.location.search).get("q"); + document + .querySelectorAll('input[name="q"]') + .forEach((el) => (el.value = query)); + if (query) Search.performSearch(query); + }, + + loadIndex: (url) => + (document.body.appendChild(document.createElement("script")).src = url), + + setIndex: (index) => { + Search._index = index; + if (Search._queued_query !== null) { + const query = Search._queued_query; + Search._queued_query = null; + Search.query(query); + } + }, + + hasIndex: () => Search._index !== null, + + deferQuery: (query) => (Search._queued_query = query), + + stopPulse: () => (Search._pulse_status = -1), + + startPulse: () => { + if (Search._pulse_status >= 0) return; + + const pulse = () => { + Search._pulse_status = (Search._pulse_status + 1) % 4; + Search.dots.innerText = ".".repeat(Search._pulse_status); + if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); + }; + pulse(); + }, + + /** + * perform a search for something (or wait until index is loaded) + */ + performSearch: (query) => { + // create the required interface elements + const searchText = document.createElement("h2"); + searchText.textContent = _("Searching"); + const searchSummary = document.createElement("p"); + searchSummary.classList.add("search-summary"); + searchSummary.innerText = ""; + const searchList = document.createElement("ul"); + searchList.classList.add("search"); + + const out = document.getElementById("search-results"); + Search.title = out.appendChild(searchText); + Search.dots = Search.title.appendChild(document.createElement("span")); + Search.status = out.appendChild(searchSummary); + Search.output = out.appendChild(searchList); + + const searchProgress = document.getElementById("search-progress"); + // Some themes don't use the search progress node + if (searchProgress) { + searchProgress.innerText = _("Preparing search..."); + } + Search.startPulse(); + + // index already loaded, the browser was quick! 
+ if (Search.hasIndex()) Search.query(query); + else Search.deferQuery(query); + }, + + /** + * execute search (requires search index to be loaded) + */ + query: (query) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + const allTitles = Search._index.alltitles; + const indexEntries = Search._index.indexentries; + + // stem the search terms and add them to the correct list + const stemmer = new Stemmer(); + const searchTerms = new Set(); + const excludedTerms = new Set(); + const highlightTerms = new Set(); + const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); + splitQuery(query.trim()).forEach((queryTerm) => { + const queryTermLower = queryTerm.toLowerCase(); + + // maybe skip this "word" + // stopwords array is from language_data.js + if ( + stopwords.indexOf(queryTermLower) !== -1 || + queryTerm.match(/^\d+$/) + ) + return; + + // stem the word + let word = stemmer.stemWord(queryTermLower); + // select the correct list + if (word[0] === "-") excludedTerms.add(word.substr(1)); + else { + searchTerms.add(word); + highlightTerms.add(queryTermLower); + } + }); + + if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js + localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) + } + + // console.debug("SEARCH: searching for:"); + // console.info("required: ", [...searchTerms]); + // console.info("excluded: ", [...excludedTerms]); + + // array of [docname, title, anchor, descr, score, filename] + let results = []; + _removeChildren(document.getElementById("search-progress")); + + const queryLower = query.toLowerCase(); + for (const [title, foundTitles] of Object.entries(allTitles)) { + if (title.toLowerCase().includes(queryLower) && (queryLower.length >= title.length/2)) { + for (const [file, id] of foundTitles) { + let score = Math.round(100 * queryLower.length / title.length) + results.push([ + docNames[file], + titles[file] !== title ? `${titles[file]} > ${title}` : title, + id !== null ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // search for explicit entries in index directives + for (const [entry, foundEntries] of Object.entries(indexEntries)) { + if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { + for (const [file, id] of foundEntries) { + let score = Math.round(100 * queryLower.length / entry.length) + results.push([ + docNames[file], + titles[file], + id ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // lookup as object + objectTerms.forEach((term) => + results.push(...Search.performObjectSearch(term, objectTerms)) + ); + + // lookup as search terms in fulltext + results.push(...Search.performTermsSearch(searchTerms, excludedTerms)); + + // let the scorer override scores with a custom scoring function + if (Scorer.score) results.forEach((item) => (item[4] = Scorer.score(item))); + + // now sort the results by score (in opposite order of appearance, since the + // display function below uses pop() to retrieve items) and then + // alphabetically + results.sort((a, b) => { + const leftScore = a[4]; + const rightScore = b[4]; + if (leftScore === rightScore) { + // same score: sort alphabetically + const leftTitle = a[1].toLowerCase(); + const rightTitle = b[1].toLowerCase(); + if (leftTitle === rightTitle) return 0; + return leftTitle > rightTitle ? -1 : 1; // inverted is intentional + } + return leftScore > rightScore ? 
1 : -1; + }); + + // remove duplicate search results + // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept + let seen = new Set(); + results = results.reverse().reduce((acc, result) => { + let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); + if (!seen.has(resultStr)) { + acc.push(result); + seen.add(resultStr); + } + return acc; + }, []); + + results = results.reverse(); + + // for debugging + //Search.lastresults = results.slice(); // a copy + // console.info("search results:", Search.lastresults); + + // print the results + _displayNextItem(results, results.length, searchTerms); + }, + + /** + * search for object names + */ + performObjectSearch: (object, objectTerms) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const objects = Search._index.objects; + const objNames = Search._index.objnames; + const titles = Search._index.titles; + + const results = []; + + const objectSearchCallback = (prefix, match) => { + const name = match[4] + const fullname = (prefix ? prefix + "." : "") + name; + const fullnameLower = fullname.toLowerCase(); + if (fullnameLower.indexOf(object) < 0) return; + + let score = 0; + const parts = fullnameLower.split("."); + + // check for different match types: exact matches of full name or + // "last name" (i.e. last dotted part) + if (fullnameLower === object || parts.slice(-1)[0] === object) + score += Scorer.objNameMatch; + else if (parts.slice(-1)[0].indexOf(object) > -1) + score += Scorer.objPartialMatch; // matches in last name + + const objName = objNames[match[1]][2]; + const title = titles[match[0]]; + + // If more than one term searched for, we require other words to be + // found in the name/title/description + const otherTerms = new Set(objectTerms); + otherTerms.delete(object); + if (otherTerms.size > 0) { + const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); + if ( + [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) + ) + return; + } + + let anchor = match[3]; + if (anchor === "") anchor = fullname; + else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; + + const descr = objName + _(", in ") + title; + + // add custom score for some objects according to scorer + if (Scorer.objPrio.hasOwnProperty(match[2])) + score += Scorer.objPrio[match[2]]; + else score += Scorer.objPrioDefault; + + results.push([ + docNames[match[0]], + fullname, + "#" + anchor, + descr, + score, + filenames[match[0]], + ]); + }; + Object.keys(objects).forEach((prefix) => + objects[prefix].forEach((array) => + objectSearchCallback(prefix, array) + ) + ); + return results; + }, + + /** + * search for full-text terms in the index + */ + performTermsSearch: (searchTerms, excludedTerms) => { + // prepare search + const terms = Search._index.terms; + const titleTerms = Search._index.titleterms; + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + + const scoreMap = new Map(); + const fileMap = new Map(); + + // perform the search on the required terms + searchTerms.forEach((word) => { + const files = []; + const arr = [ + { files: terms[word], score: Scorer.term }, + { files: titleTerms[word], score: Scorer.title }, + ]; + // add support for partial matches + if (word.length > 2) { + const escapedWord = _escapeRegExp(word); + Object.keys(terms).forEach((term) => { + if (term.match(escapedWord) && !terms[word]) + arr.push({ 
files: terms[term], score: Scorer.partialTerm }); + }); + Object.keys(titleTerms).forEach((term) => { + if (term.match(escapedWord) && !titleTerms[word]) + arr.push({ files: titleTerms[word], score: Scorer.partialTitle }); + }); + } + + // no match but word was a required one + if (arr.every((record) => record.files === undefined)) return; + + // found search word in contents + arr.forEach((record) => { + if (record.files === undefined) return; + + let recordFiles = record.files; + if (recordFiles.length === undefined) recordFiles = [recordFiles]; + files.push(...recordFiles); + + // set score for the word in each file + recordFiles.forEach((file) => { + if (!scoreMap.has(file)) scoreMap.set(file, {}); + scoreMap.get(file)[word] = record.score; + }); + }); + + // create the mapping + files.forEach((file) => { + if (fileMap.has(file) && fileMap.get(file).indexOf(word) === -1) + fileMap.get(file).push(word); + else fileMap.set(file, [word]); + }); + }); + + // now check if the files don't contain excluded terms + const results = []; + for (const [file, wordList] of fileMap) { + // check if all requirements are matched + + // as search terms with length < 3 are discarded + const filteredTermCount = [...searchTerms].filter( + (term) => term.length > 2 + ).length; + if ( + wordList.length !== searchTerms.size && + wordList.length !== filteredTermCount + ) + continue; + + // ensure that none of the excluded terms is in the search result + if ( + [...excludedTerms].some( + (term) => + terms[term] === file || + titleTerms[term] === file || + (terms[term] || []).includes(file) || + (titleTerms[term] || []).includes(file) + ) + ) + break; + + // select one (max) score for the file. + const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); + // add result to the result list + results.push([ + docNames[file], + titles[file], + "", + null, + score, + filenames[file], + ]); + } + return results; + }, + + /** + * helper function to return a node containing the + * search summary for a given text. keywords is a list + * of stemmed words. + */ + makeSearchSummary: (htmlText, keywords) => { + const text = Search.htmlToText(htmlText); + if (text === "") return null; + + const textLower = text.toLowerCase(); + const actualStartPosition = [...keywords] + .map((k) => textLower.indexOf(k.toLowerCase())) + .filter((i) => i > -1) + .slice(-1)[0]; + const startWithContext = Math.max(actualStartPosition - 120, 0); + + const top = startWithContext === 0 ? "" : "..."; + const tail = startWithContext + 240 < text.length ? "..." : ""; + + let summary = document.createElement("p"); + summary.classList.add("context"); + summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; + + return summary; + }, +}; + +_ready(Search.init); diff --git a/branch/seperate_maz_and_taz/_static/sphinx_highlight.js b/branch/seperate_maz_and_taz/_static/sphinx_highlight.js new file mode 100644 index 0000000..aae669d --- /dev/null +++ b/branch/seperate_maz_and_taz/_static/sphinx_highlight.js @@ -0,0 +1,144 @@ +/* Highlighting utilities for Sphinx HTML documentation. */ +"use strict"; + +const SPHINX_HIGHLIGHT_ENABLED = true + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. 
+ */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + parent.insertBefore( + span, + parent.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const SphinxHighlight = { + + /** + * highlight the search words provided in localstorage in the text + */ + highlightSearchWords: () => { + if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight + + // get and clear terms from localstorage + const url = new URL(window.location); + const highlight = + localStorage.getItem("sphinx_highlight_terms") + || url.searchParams.get("highlight") + || ""; + localStorage.removeItem("sphinx_highlight_terms") + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + + // get individual terms from highlight string + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? 
divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + localStorage.removeItem("sphinx_highlight_terms") + }, + + initEscapeListener: () => { + // only install a listener if it is really needed + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; + if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { + SphinxHighlight.hideSearchWords(); + event.preventDefault(); + } + }); + }, +}; + +_ready(SphinxHighlight.highlightSearchWords); +_ready(SphinxHighlight.initEscapeListener); diff --git a/branch/seperate_maz_and_taz/autodoc/index.html b/branch/seperate_maz_and_taz/autodoc/index.html new file mode 100644 index 0000000..4179b9e --- /dev/null +++ b/branch/seperate_maz_and_taz/autodoc/index.html @@ -0,0 +1,165 @@ + + + + + + + Lasso Classes and Functions — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

+ + + + + + + + + + + + + + + + + + +

CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files.

ModelRoadwayNetwork

Subclass of the network_wrangler RoadwayNetwork class.

Project

A single or set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks including time of day, categories, etc.
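Taken together, these classes support lasso's project-card workflow: CubeTransit parses and compares Cube line files, and Project packages the resulting differences as a project card. The sketch below illustrates that flow using method and argument names that appear in the API reference in this documentation; the exact signatures and file locations are assumptions rather than a verified recipe.

import os

from lasso import Project

SCRATCH_DIR = "scratch"  # placeholder output directory

# Compare a base and a build set of Cube transit line (.lin) files
# (both paths are placeholders for directories of .lin files).
test_project = Project.create_project(
    base_cube_transit_source=os.path.join("examples", "cube", "base"),
    build_cube_transit_source=os.path.join("examples", "cube", "build"),
)

# Write the evaluated differences out as a project card for Network Wrangler.
test_project.write_project_card(
    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
)

The resulting card can then be applied back to a standard network with network_wrangler.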

+
+
+

Utils and Functions

+ + + + + + + + + +

util

logger

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/genindex/index.html b/branch/seperate_maz_and_taz/genindex/index.html new file mode 100644 index 0000000..b38d787 --- /dev/null +++ b/branch/seperate_maz_and_taz/genindex/index.html @@ -0,0 +1,1040 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
+
+
+
+
+ + +

Index

+ +
+ _ + | A + | B + | C + | D + | E + | F + | G + | H + | I + | K + | L + | M + | N + | O + | P + | R + | S + | T + | U + | V + | W + | X + | Y + | Z + +
+

_

+ + +
+ +

A

+ + + +
+ +

B

+ + + +
+ +

C

+ + + +
+ +

D

+ + + +
+ +

E

+ + + +
+ +

F

+ + + +
+ +

G

+ + + +
+ +

H

+ + + +
+ +

I

+ + + +
+ +

K

+ + +
+ +

L

+ + + +
+ +

M

+ + + +
+ +

N

+ + + +
+ +

O

+ + + +
+ +

P

+ + + +
+ +

R

+ + + +
+ +

S

+ + + +
+ +

T

+ + + +
+ +

U

+ + + +
+ +

V

+ + + +
+ +

W

+ + + +
+ +

X

+ + + +
+ +

Y

+ + +
+ +

Z

+ + + +
+ + + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/index.html b/branch/seperate_maz_and_taz/index.html new file mode 100644 index 0000000..c65fa50 --- /dev/null +++ b/branch/seperate_maz_and_taz/index.html @@ -0,0 +1,181 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the network_wrangler package for MetCouncil and MTC. It aims to provide the following functionality:

  1. parse Cube log files and base highway networks and create ProjectCards for Network Wrangler
  2. parse two Cube transit line files and create ProjectCards for Network Wrangler
  3. refine Network Wrangler highway networks to contain specific variables and settings for the respective agency and export them to a format that can be read in by Citilabs' Cube software (a sketch of this step follows the list)
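A minimal sketch of the third step (refining a standard network and exporting it for Cube) is shown below. The ModelRoadwayNetwork method names are taken from the API reference in this documentation, but the file paths are placeholders and the exact signatures are assumptions that may differ from the released package.

import os

from lasso import ModelRoadwayNetwork

EXAMPLE_DIR = "examples"   # placeholder: directory holding standard network files
OUTPUT_DIR = "scratch"     # placeholder: where the Cube-ready files are written

# Read a standard (network_wrangler format) roadway network from disk.
net = ModelRoadwayNetwork.read(
    link_filename=os.path.join(EXAMPLE_DIR, "link.json"),
    node_filename=os.path.join(EXAMPLE_DIR, "node.geojson"),
    shape_filename=os.path.join(EXAMPLE_DIR, "shape.geojson"),
    fast=True,  # assumed flag: skip slower validation steps
)

# Add the MetCouncil-specific variables and settings (assumed to act in place).
net.roadway_standard_to_met_council_network()

# Write fixed-width link/node files plus a Cube script that builds the network.
net.write_roadway_as_fixedwidth(output_dir=OUTPUT_DIR)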
+ +
+
+

Indices and tables

+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/objects.inv b/branch/seperate_maz_and_taz/objects.inv new file mode 100644 index 0000000..57b89c9 Binary files /dev/null and b/branch/seperate_maz_and_taz/objects.inv differ diff --git a/branch/seperate_maz_and_taz/py-modindex/index.html b/branch/seperate_maz_and_taz/py-modindex/index.html new file mode 100644 index 0000000..46c6f94 --- /dev/null +++ b/branch/seperate_maz_and_taz/py-modindex/index.html @@ -0,0 +1,136 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
+
+
+
+
+ + +

Python Module Index

+ +
+ l +
+ + + + + + + + + + + + + +
 
+ l
+ lasso +
    + lasso.logger +
    + lasso.util +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/running/index.html b/branch/seperate_maz_and_taz/running/index.html new file mode 100644 index 0000000..6feca00 --- /dev/null +++ b/branch/seperate_maz_and_taz/running/index.html @@ -0,0 +1,133 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
+

Create a scenario

+
+
+

Exporting networks

+
+
+

Auditing and Reporting

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/search/index.html b/branch/seperate_maz_and_taz/search/index.html new file mode 100644 index 0000000..6e5865f --- /dev/null +++ b/branch/seperate_maz_and_taz/search/index.html @@ -0,0 +1,126 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
+
+
+
+
+ + + + +
+ +
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + + + + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/searchindex.js b/branch/seperate_maz_and_taz/searchindex.js new file mode 100644 index 0000000..69a3e2a --- /dev/null +++ b/branch/seperate_maz_and_taz/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({"docnames": ["_generated/lasso.CubeTransit", "_generated/lasso.ModelRoadwayNetwork", "_generated/lasso.Parameters", "_generated/lasso.Project", "_generated/lasso.StandardTransit", "_generated/lasso.logger", "_generated/lasso.util", "autodoc", "index", "running", "setup", "starting"], "filenames": ["_generated/lasso.CubeTransit.rst", "_generated/lasso.ModelRoadwayNetwork.rst", "_generated/lasso.Parameters.rst", "_generated/lasso.Project.rst", "_generated/lasso.StandardTransit.rst", "_generated/lasso.logger.rst", "_generated/lasso.util.rst", "autodoc.rst", "index.rst", "running.md", "setup.md", "starting.md"], "titles": ["lasso.CubeTransit", "lasso.ModelRoadwayNetwork", "lasso.Parameters", "lasso.Project", "lasso.StandardTransit", "lasso.logger", "lasso.util", "Lasso Classes and Functions", "Welcome to lasso\u2019s documentation!", "Running Lasso", "Setup", "Starting Out"], "terms": {"class": [0, 1, 2, 3, 4, 6, 8, 11], "paramet": [0, 1, 3, 4, 6, 8], "sourc": [0, 1, 2, 3, 4, 5, 6, 11], "base": [0, 1, 2, 3, 4, 6, 8, 11], "object": [0, 1, 2, 3, 4, 6, 11], "store": [0, 1, 11], "inform": [0, 1, 4, 11], "about": [0, 1, 4, 11], "transit": [0, 1, 2, 3, 4, 8], "defin": [0, 1, 2, 3, 11], "cube": [0, 1, 2, 3, 4, 8], "line": [0, 1, 2, 3, 4, 6, 8, 11], "file": [0, 1, 2, 3, 4, 8], "ha": [0, 6, 11], "capabl": [0, 11], "pars": [0, 8, 11], "properti": [0, 1, 2, 4, 6, 11], "shape": [0, 1, 3, 4, 6, 11], "python": [0, 1, 2, 11], "dictionari": [0, 1, 2, 3, 4, 6, 11], "compar": [0, 3, 4, 11], "repres": [0, 2, 4, 6, 11], "chang": [0, 1, 3, 4, 11], "project": [0, 1, 2, 4, 6, 8], "card": [0, 1, 3, 4], "typic": [0, 3, 4, 6, 8], "usag": [0, 1, 3, 4], "exampl": [0, 1, 3, 4, 6, 11], "tn": [0, 11], "create_from_cub": [0, 11], "cube_dir": [0, 3, 11], "transit_change_list": [0, 11], "evaluate_differ": [0, 4, 11], "base_transit_network": [0, 3, 11], "list": [0, 1, 2, 3, 4, 6, 11], "string": [0, 1, 3, 4, 6], "uniqu": [0, 1, 3], "name": [0, 1, 2, 3, 4, 6, 11], "network": [0, 1, 2, 3, 4, 5, 8], "type": [0, 1, 2, 3, 4, 6, 11], "line_properti": 0, "kei": [0, 1, 3, 11], "valu": [0, 1, 3, 4, 6, 11], "ar": [0, 1, 2, 3, 6, 11], "These": 0, "directli": 0, "read": [0, 1, 3, 4, 8, 11], "from": [0, 1, 2, 3, 4, 6, 8], "haven": 0, "t": [0, 1, 6], "been": [0, 6], "translat": [0, 4, 11], "standard": [0, 1, 2, 4, 11], "dict": [0, 1, 2, 3, 4], "panda": [0, 1, 3], "datafram": [0, 1, 3, 4, 11], "node": [0, 1, 2, 3, 4, 6, 11], "follow": [0, 4, 8, 11], "column": [0, 1, 4], "node_id": 0, "int": [0, 1, 2, 3, 4, 6], "posit": [0, 6], "integ": [0, 1, 2], "id": [0, 1, 3, 4, 11], "number": [0, 1, 2, 3, 4, 6, 11], "neg": [0, 6], "indic": [0, 1, 6], "non": [0, 6], "stop": [0, 4], "boolean": [0, 1], "i": [0, 1, 3, 4, 5, 6, 8, 11], "order": [0, 1, 2, 6, 11], "within": [0, 1, 2, 6], "thi": [0, 1, 2, 3, 4, 6, 8, 11], "program_typ": 0, "either": [0, 1, 5, 6, 11], "pt": 0, "trnbld": 0, "str": [0, 1, 2, 3, 6], "instanc": [0, 1, 2, 3, 4, 11], "appli": [0, 1, 6], "which": [0, 1, 3, 6, 11], "includ": [0, 1, 2, 11], "time": [0, 1, 2, 4, 6, 11], "period": [0, 1, 2, 4, 11], "variabl": [0, 1, 2, 3, 4, 8, 11], "source_list": 0, "have": [0, 1, 6, 8], "ad": [0, 1, 3, 6, 11], "diff_dict": 0, "__init__": [0, 1, 2, 3, 4], "constructor": [0, 1, 3], "set": [0, 1, 2, 3, 4, 5, 6, 8, 11], "see": [0, 
1, 3, 4, 6], "an": [0, 1, 3, 4, 6, 11], "method": [0, 1, 2, 3, 4, 6, 11], "add_additional_time_period": 0, "new_time_period_numb": 0, "orig_line_nam": 0, "copi": [0, 1, 3, 4, 6, 11], "rout": [0, 1, 4], "anoth": 0, "appropri": [0, 4], "specif": [0, 1, 8, 11], "new": [0, 1, 6, 11], "under": 0, "self": [0, 1, 2, 3, 4, 6], "origin": [0, 1, 6, 11], "its": [0, 1], "return": [0, 1, 3, 4, 6], "add_cub": 0, "transit_sourc": 0, "lin": [0, 3, 4], "add": [0, 1, 3, 6, 11], "exist": [0, 1, 3, 6, 11], "transitnetwork": [0, 4], "directori": [0, 1, 4, 11], "static": [0, 1, 3, 4], "build_route_nam": 0, "route_id": [0, 4], "time_period": [0, 1, 2], "agency_id": 0, "0": [0, 1, 2, 4, 6, 11], "direction_id": 0, "1": [0, 1, 2, 4, 6, 11], "creat": [0, 1, 2, 3, 4, 6, 8, 11], "contaten": 0, "agenc": [0, 8], "direct": [0, 1, 6, 11], "e": [0, 1, 11], "452": 0, "111": 0, "pk": [0, 2], "construct": [0, 6, 11], "line_nam": 0, "0_452": 0, "111_452_pk1": 0, "calculate_start_end_tim": 0, "line_properties_dict": 0, "calcul": [0, 1, 2, 3, 4, 6], "start": [0, 1, 4, 8], "end": [0, 1, 6], "warn": [0, 1], "doesn": [0, 1], "take": [0, 1], "care": 0, "discongru": 0, "flavor": [0, 1], "create_add_route_card_dict": 0, "format": [0, 1, 2, 4, 8, 11], "route_properti": 0, "being": 0, "updat": [0, 1, 11], "A": [0, 1, 2, 3, 4, 6, 11], "addit": [0, 1, 6, 8, 11], "create_delete_route_card_dict": 0, "base_transit_line_properties_dict": 0, "delet": [0, 1], "style": [0, 6], "attribut": [0, 1, 2, 3, 4, 11], "find": [0, 1, 4], "create_update_route_card_dict": 0, "updated_properties_dict": 0, "cube_properties_to_standard_properti": 0, "cube_properties_dict": 0, "convert": [0, 1, 4, 6], "most": 0, "pertin": 0, "like": [0, 1, 2, 6], "headwai": [0, 2, 4], "varibl": [0, 2], "stnadard": 0, "unit": [0, 1, 6], "minut": 0, "second": [0, 4, 6], "correct": 0, "base_transit": 0, "identifi": [0, 1, 11], "what": [0, 1, 3, 6], "need": [0, 1, 2, 3, 4], "For": [0, 1, 4, 6], "multipl": [0, 1, 6, 11], "make": [0, 1, 11], "duplic": 0, "so": [0, 1, 5, 6, 11], "each": [0, 1, 3, 6], "condit": [0, 1], "contain": [0, 1, 4, 6, 8, 11], "requir": [0, 1, 6, 11], "evalu": [0, 1, 3], "differ": [0, 2, 6], "between": [0, 1, 2, 3, 6], "evaluate_route_property_differ": 0, "properties_build": 0, "properties_bas": 0, "time_period_numb": 0, "absolut": [0, 6], "true": [0, 1, 3, 4, 5, 6, 11], "validate_bas": 0, "fals": [0, 1, 3, 4, 6, 11], "check": [0, 1, 3, 6], "ani": [0, 1, 3, 6, 11], "entri": [0, 3], "property_nam": 0, "property_valu": 0, "us": [0, 1, 2, 3, 4, 6, 11], "command": [0, 1, 3], "rather": [0, 1], "than": [0, 1, 6], "If": [0, 1, 3, 4, 6, 11], "automat": 0, "note": [0, 6, 11], "onli": [0, 1, 3, 4, 6], "numer": [0, 4, 6], "frequenc": [0, 4, 11], "suitabl": 0, "write": [0, 1, 3, 4, 11], "evaluate_route_shape_chang": 0, "shape_build": 0, "shape_bas": 0, "two": [0, 1, 6, 8, 11], "build": [0, 1, 3, 11], "version": [0, 6, 11], "ddatafram": 0, "get_time_period_numbers_from_cube_properti": 0, "properties_list": 0, "associ": [0, 2], "them": [0, 1, 4, 6, 8], "all": [0, 1, 2, 5, 6, 11], "found": [0, 1, 6], "unpack_route_nam": 0, "unpack": 0, "info": [0, 1, 2], "link": [1, 2, 3, 6, 11], "kwarg": [1, 2, 3, 6], "roadwaynetwork": [1, 3, 4], "subclass": [1, 11], "network_wrangl": [1, 8, 11], "represent": [1, 3, 4, 6], "physic": 1, "roadwai": [1, 2, 3, 4, 11], "geodatafram": [1, 11], "specifi": [1, 3, 6], "default": [1, 2, 3, 4, 6, 11], "cr": [1, 3, 6], "coordin": [1, 3, 6], "refer": [1, 3, 4, 6, 11], "system": [1, 3], "espg": [1, 3], "node_foreign_kei": [1, 3], "tabl": [1, 2, 3, 6], 
"link_foreign_kei": [1, 3], "foreign": [1, 3], "shape_foreign_kei": [1, 3, 11], "unique_link_id": [1, 3], "unique_node_id": [1, 3], "modes_to_network_link_vari": [1, 3], "map": [1, 2, 3, 6, 11], "mode": [1, 3, 4, 11], "modes_to_network_nodes_vari": [1, 3], "managed_lanes_node_id_scalar": [1, 3], "scalar": [1, 3, 6], "primari": [1, 3], "correspond": [1, 3, 4], "manag": [1, 2, 3, 11], "lane": [1, 2, 3, 11], "managed_lanes_link_id_scalar": [1, 3], "managed_lanes_required_attribut": [1, 3], "must": [1, 3, 6], "keep_same_attributes_ml_and_gp": [1, 3], "parallel": [1, 3, 6], "gener": [1, 3], "purpos": [1, 3, 6], "add_count": 1, "network_vari": 1, "aadt": 1, "mndot_count_shst_data": 1, "none": [1, 3, 4, 5, 6], "widot_count_shst_data": 1, "mndot_count_variable_shp": [1, 2], "widot_count_variable_shp": 1, "count": [1, 2], "mc": [1, 2, 4], "join": [1, 3, 4, 6, 11], "data": [1, 2, 3, 4, 8, 11], "via": 1, "shst": 1, "api": 1, "match": 1, "result": [1, 6, 11], "should": [1, 2, 3, 6, 11], "written": [1, 11], "path": [1, 3, 4, 6, 11], "mndot": [1, 2], "locat": [1, 2, 3, 4], "widot": 1, "geodatabas": 1, "add_incident_link_data_to_nod": 1, "links_df": [1, 11], "nodes_df": [1, 11], "link_vari": 1, "unique_node_kei": 1, "model_node_id": [1, 2], "go": [1, 11], "assess": [1, 3], "connect": 1, "incid": 1, "where": 1, "length": [1, 6], "n": [1, 2, 3, 6, 11], "out": [1, 2, 5, 6, 8], "add_new_roadway_feature_chang": 1, "featur": [1, 6], "also": [1, 6, 11], "valid": [1, 6, 11], "add_variable_using_shst_refer": 1, "var_shst_csvdata": 1, "shst_csv_variabl": 1, "network_var_typ": 1, "overwrit": 1, "bool": [1, 3, 6], "addition_map": 1, "show": 1, "project_card_dictionari": 1, "wrapper": [1, 8, 11], "apply_managed_lane_feature_chang": 1, "link_idx": 1, "in_plac": [1, 11], "lndice": 1, "whether": 1, "decid": 1, "connector": [1, 2], "when": [1, 3, 4, 6], "thei": [1, 2], "more": [1, 6, 11], "apply_python_calcul": 1, "pycod": 1, "execut": 1, "code": [1, 2, 4, 6, 11], "apply_roadway_feature_chang": [1, 11], "select": [1, 11], "pass": [1, 5, 6], "assess_connect": [1, 11], "ignore_end_nod": [1, 11], "graph": 1, "disconnect": 1, "subgraph": 1, "describ": [1, 11], "member": [1, 6], "one": [1, 6, 11], "drive": [1, 11], "walk": [1, 11], "bike": 1, "ignor": [1, 6], "strai": 1, "singleton": 1, "tupl": [1, 6], "osmnx": [1, 11], "networkx": 1, "digraph": 1, "build_selection_kei": 1, "selection_dict": 1, "combin": [1, 2, 4, 6], "queri": [1, 11], "b": [1, 2, 11], "you": [1, 3, 11], "selection_dictonari": 1, "serv": 1, "calculate_area_typ": 1, "area_type_shap": [1, 2], "area_type_shape_vari": 1, "area_typ": [1, 2, 3], "area_type_codes_dict": 1, "downtown_area_type_shap": [1, 2], "downtown_area_typ": [1, 2], "area": [1, 2, 3, 6], "centroid": [1, 2, 6], "geometri": [1, 2, 6], "field": [1, 2, 11], "determin": [1, 3, 6], "label": 1, "isn": 1, "perfect": 1, "much": 1, "quicker": 1, "other": [1, 6, 11], "The": [1, 2, 5, 6, 11], "geodadabas": 1, "input": [1, 6], "downtown": [1, 2], "boundari": [1, 2, 6], "counti": [1, 2, 3], "calculate_centroidconnect": 1, "centroidconnect": [1, 2, 3], "highest_taz_numb": [1, 2], "as_integ": 1, "max": 1, "taz": [1, 2], "calculate_counti": 1, "county_shap": [1, 2], "county_shape_vari": 1, "county_codes_dict": 1, "calculate_dist": 1, "distanc": [1, 2, 3, 6], "centroidconnect_onli": 1, "mile": 1, "centroidconnector": 1, "calculate_mpo": 1, "county_network_vari": 1, "mpo": [1, 2], "mpo_counti": [1, 2], "param": [1, 4], "county_vari": 1, "region": [1, 6], "calculate_us": 1, "defauli": 1, "convert_int": 1, 
"int_col_nam": 1, "create_ml_vari": 1, "ml_lane": [1, 2], "ml": 1, "placehold": 1, "come": 1, "log": [1, 3, 8], "create_calculated_vari": 1, "create_dummy_connector_link": 1, "ml_df": 1, "access_lan": 1, "egress_lan": 1, "access_roadwai": 1, "ml_access": 1, "egress_roadwai": 1, "access_name_prefix": 1, "access": [1, 4, 6], "dummi": 1, "egress_name_prefix": 1, "egress": 1, "gp_df": 1, "roadai": 1, "prefix": 1, "create_hov_corridor_vari": 1, "segment_id": [1, 2], "hov": 1, "corridor": 1, "create_managed_lane_network": [1, 11], "keep_additional_attributes_ml_and_gp": 1, "separ": [1, 3, 6], "look": 1, "want": [1, 11], "leav": 1, "some": 1, "rigor": 1, "test": [1, 2, 6], "create_managed_vari": 1, "dataframe_to_fixed_width": 1, "df": 1, "fix": [1, 2], "width": [1, 6], "transform": [1, 6], "delete_roadway_feature_chang": 1, "ignore_miss": 1, "get": [1, 2, 6, 11], "miss": 1, "fail": [1, 6], "deletion_map": 1, "fill_na": 1, "fill": [1, 6], "na": 1, "from_roadwaynetwork": [1, 11], "roadway_network_object": 1, "get_attribut": 1, "join_kei": 1, "source_shst_ref_df": 1, "source_gdf": 1, "field_nam": 1, "get_managed_lane_node_id": 1, "nodes_list": 1, "4500000": 1, "237": 1, "get_modal_graph": 1, "bike_access": [1, 2], "bu": [1, 4], "bus_onli": [1, 2], "drive_access": [1, 2, 11], "rail": [1, 4], "rail_onli": [1, 2], "walk_access": [1, 2], "strongli": 1, "vertex": [1, 6], "reachabl": 1, "everi": [1, 6], "get_modal_links_nod": 1, "kept": 1, "both": [1, 4, 6, 11], "filter": [1, 6], "right": [1, 6], "now": 1, "we": [1, 4, 11], "don": 1, "becaus": [1, 4, 6, 11], "mark": 1, "issu": 1, "discuss": 1, "http": [1, 4, 6, 11], "github": [1, 6, 11], "com": [1, 4, 6, 11], "wsp": [1, 11], "sag": [1, 11], "145": 1, "modal_nodes_df": 1, "mode_node_vari": 1, "get_property_by_time_period_and_group": 1, "prop": 1, "categori": [1, 2, 11], "default_return": 1, "seri": [1, 11], "group": 1, "16": [1, 2, 6], "00": [1, 2], "19": [1, 2], "option": [1, 6], "sov": [1, 2], "search": [1, 2, 8, 11], "hov3": [1, 2], "hov2": [1, 2], "identify_seg": 1, "o_id": 1, "d_id": 1, "endpoint": 1, "up": [1, 5, 11], "segment": [1, 4, 6, 11], "candid": 1, "otherwis": [1, 4, 6], "ram": 1, "hog": 1, "could": [1, 6, 11], "odd": 1, "shortest": 1, "segment_vari": 1, "keep": [1, 6], "identify_segment_endpoint": 1, "min_connecting_link": 1, "10": [1, 2, 6], "min_dist": 1, "max_link_devi": 1, "2": [1, 2, 4, 6, 11], "is_network_connect": [1, 11], "consid": [1, 6, 11], "cach": 1, "long": [1, 6], "load_transform_network": 1, "node_filenam": [1, 11], "link_filenam": [1, 11], "shape_filenam": [1, 11], "4326": [1, 6], "validate_schema": 1, "disk": 1, "schema": [1, 11], "shapes_df": [1, 11], "network_connection_plot": 1, "g": [1, 6], "disconnected_subgraph_nod": 1, "plot": 1, "fig": 1, "ax": [1, 6], "orig_dest_nodes_foreign_kei": 1, "whatev": 1, "u": 1, "v": [1, 2, 11], "ab": 1, "noth": 1, "assum": 1, "a_id": 1, "b_id": 1, "ox_graph": 1, "unique_link_kei": 1, "model_link_id": [1, 2, 3], "arrai": [1, 6], "remov": [1, 6], "certain": 1, "do": [1, 6, 11], "too": [1, 5], "link_df": 1, "referenc": 1, "multidigraph": 1, "path_search": 1, "candidate_links_df": 1, "weight_column": 1, "weight_factor": 1, "search_breadth": 1, "5": [1, 2, 4, 6], "max_search_breadth": 1, "candidate_link": 1, "part": [1, 6, 11], "foreigh": 1, "destin": 1, "weight": 1, "iter": [1, 6], "multipli": 1, "fast": [1, 11], "recalculate_calculated_vari": [1, 3], "recalculate_dist": [1, 3], "json": [1, 11], "geojson": 1, "skip": 1, "speed": 1, "spatial": [1, 6], "etc": [1, 2, 3, 11], "re": 1, 
"read_match_result": 1, "lot": 1, "same": [1, 4, 6], "concaten": 1, "singl": [1, 3, 6], "geopanda": [1, 11], "sure": 1, "why": 1, "util": [1, 8, 11], "rename_variables_for_dbf": 1, "input_df": 1, "variable_crosswalk": 1, "output_vari": [1, 2], "convert_geometry_to_xi": 1, "renam": [1, 3], "dbf": 1, "shp": [1, 2], "char": 1, "crosswalk": [1, 3], "x": [1, 2, 6], "y": [1, 2, 6], "roadway_net_to_gdf": 1, "roadway_net": 1, "turn": [1, 11], "export": [1, 8, 11], "sophist": 1, "attach": 1, "roadway_standard_to_met_council_network": 1, "output_epsg": [1, 2], "consist": [1, 6, 11], "metcouncil": [1, 2, 4, 8, 11], "": [1, 2, 4, 6, 11], "model": [1, 2], "expect": [1, 6], "epsg": [1, 2, 6], "output": [1, 2, 3, 4, 6], "select_roadway_featur": [1, 11], "search_mod": 1, "force_search": 1, "sp_weight_factor": 1, "satisfi": [1, 6], "criteria": 1, "net": [1, 11], "osm": [1, 2], "share": [1, 11], "street": [1, 11], "osm_model_link_id": 1, "1234": 1, "shstid": 1, "4321": 1, "regex": 1, "facil": [1, 2, 11], "main": 1, "st": [1, 2], "least": [1, 11], "perform": [1, 4], "even": 1, "previou": 1, "discourag": 1, "meander": 1, "ref": 1, "here": [1, 6], "defaul": 1, "selection_has_unique_link_id": 1, "selection_dictionari": 1, "selection_map": 1, "selected_link_idx": 1, "candidate_link_idx": 1, "selected_links_idx": 1, "candidate_links_idx": 1, "shortest_path": 1, "graph_links_df": 1, "100": 1, "four": 1, "nx": 1, "split_properties_by_time_period_and_categori": 1, "properties_to_split": [1, 2], "split": [1, 2, 4], "structur": 1, "stratifi": 1, "times_period": 1, "am": [1, 2], "6": [1, 2, 4, 6], "pm": [1, 2], "15": [1, 2], "update_dist": 1, "use_shap": 1, "inplac": 1, "straight": 1, "avail": 1, "portion": 1, "provid": [1, 3, 4, 6, 11], "entir": 1, "crow": 1, "fly": 1, "meter": [1, 6], "nan": 1, "validate_link_schema": 1, "schema_loc": 1, "roadway_network_link": 1, "validate_node_schema": 1, "node_fil": 1, "roadway_network_nod": 1, "validate_properti": 1, "ignore_exist": 1, "require_existing_for_chang": 1, "theproject": 1, "dictonari": 1, "validate_select": 1, "selection_requir": 1, "whetther": 1, "minimum": [1, 5, 6], "validate_shape_schema": 1, "shape_fil": 1, "roadway_network_shap": 1, "validate_uniqu": 1, "confirm": 1, "met": 1, "filenam": [1, 3, 11], "were": 1, "save": 1, "write_roadway_as_fixedwidth": [1, 11], "output_dir": 1, "node_output_vari": 1, "link_output_vari": 1, "output_link_txt": [1, 2], "output_node_txt": [1, 2], "output_link_header_width_txt": [1, 2], "output_node_header_width_txt": [1, 2], "output_cube_network_script": [1, 2], "drive_onli": 1, "function": [1, 5, 6, 8, 11], "doe": [1, 4], "header": [1, 2], "3": [1, 2, 4, 6, 11], "script": [1, 2], "run": [1, 8], "databas": [1, 3], "record": [1, 4], "write_roadway_as_shp": [1, 11], "data_to_csv": 1, "data_to_dbf": 1, "output_link_shp": [1, 2], "output_node_shp": [1, 2], "output_link_csv": [1, 2], "output_node_csv": [1, 2], "output_gpkg": 1, "output_link_gpkg_lay": 1, "output_node_gpkg_lay": 1, "output_gpkg_link_filt": 1, "gpkg": 1, "csv": [1, 2, 3, 11], "full": [1, 6], "geopackag": 1, "layer": [1, 11], "subset": 1, "calculated_valu": [1, 3], "dai": [2, 4, 11], "can": [2, 6, 8, 11], "runtim": [2, 11], "initi": [2, 3, 11], "keyword": [2, 6, 11], "argument": [2, 6, 11], "explicitli": [2, 11], "highlight": [2, 11], "attr": 2, "time_period_to_tim": 2, "abbrevi": [2, 4], "gtf": [2, 4, 11], "highwai": [2, 3, 8, 11], "ea": 2, "md": 2, "ev": 2, "cube_time_period": 2, "4": [2, 4, 6], "demand": 2, "allow": [2, 6], "suffix": 2, "truck": 2, "trk": 2, "final": 2, 
"lanes_am": 2, "time_periods_to_tim": 2, "shapefil": 2, "r": 2, "metcouncil_data": 2, "cb_2017_us_county_5m": 2, "county_variable_shp": 2, "lanes_lookup_fil": 2, "lookup": 2, "centroid_connect_lan": 2, "anoka": 2, "dakota": 2, "hennepin": 2, "ramsei": 2, "scott": 2, "washington": 2, "carver": 2, "taz_shap": 2, "tazofficialwcurrentforecast": 2, "taz_data": 2, "highest": 2, "3100": 2, "link_id": 2, "shstgeometryid": 2, "roadway_class": 2, "truck_access": 2, "trn_priority_ea": 2, "trn_priority_am": 2, "trn_priority_md": 2, "trn_priority_pm": 2, "trn_priority_ev": 2, "ttime_assert_ea": 2, "ttime_assert_am": 2, "ttime_assert_md": 2, "ttime_assert_pm": 2, "ttime_assert_ev": 2, "lanes_ea": 2, "lanes_md": 2, "lanes_pm": 2, "lanes_ev": 2, "price_sov_ea": 2, "price_hov2_ea": 2, "price_hov3_ea": 2, "price_truck_ea": 2, "price_sov_am": 2, "price_hov2_am": 2, "price_hov3_am": 2, "price_truck_am": 2, "price_sov_md": 2, "price_hov2_md": 2, "price_hov3_md": 2, "price_truck_md": 2, "price_sov_pm": 2, "price_hov2_pm": 2, "price_hov3_pm": 2, "price_truck_pm": 2, "price_sov_ev": 2, "price_hov2_ev": 2, "price_hov3_ev": 2, "price_truck_ev": 2, "roadway_class_idx": 2, "facility_typ": 2, "osm_node_id": [2, 6, 11], "bike_nod": 2, "transit_nod": 2, "walk_nod": 2, "drive_nod": 2, "ml_lanes_ea": 2, "ml_lanes_am": 2, "ml_lanes_md": 2, "ml_lanes_pm": 2, "ml_lanes_ev": 2, "osm_facility_type_dict": 2, "thrivemsp2040communitydesign": 2, "area_type_variable_shp": 2, "comdes2040": 2, "area_type_code_dict": 2, "23": 2, "urban": [2, 4], "center": [2, 6], "24": 2, "25": 2, "35": 2, "36": 2, "41": 2, "51": 2, "52": 2, "53": 2, "60": 2, "downtownzones_taz": 2, "mrcc_roadway_class_shap": 2, "mrcc": 2, "trans_mrcc_centerlin": 2, "mrcc_roadway_class_variable_shp": 2, "mrcc_roadway_class_shp": 2, "route_si": 2, "widot_roadway_class_shap": 2, "wisconsin": 2, "wisconsin_lanes_counts_median": 2, "wislr": 2, "widot_roadway_class_variable_shp": 2, "rdwy_ctgy_": 2, "mndot_count_shap": 2, "count_mn": 2, "aadt_2017_count_loc": 2, "osm_highway_facility_type_crosswalk": 2, "legacy_tm2_attribut": 2, "shstreferenceid": 2, "legaci": 2, "tm2": 2, "osm_lanes_attribut": 2, "tam_tm2_attribut": 2, "tam": 2, "tom_tom_attribut": 2, "tomtom": 2, "tomtom_attribut": 2, "sfcta_attribut": 2, "sfcta": 2, "geograph": 2, "102646": 2, "scratch": 2, "txt": [2, 11], "links_header_width": 2, "nodes_header_width": 2, "import": [2, 6], "make_complete_network_from_fixed_width_fil": 2, "county_link_range_dict": 2, "county_code_dict": 2, "7": [2, 4, 11], "extern": 2, "chisago": 2, "11": 2, "goodhu": 2, "12": 2, "isanti": 2, "13": 2, "le": 2, "sueur": 2, "14": 2, "mcleod": 2, "pierc": 2, "polk": 2, "17": 2, "rice": 2, "18": 2, "sherburn": 2, "siblei": 2, "20": 2, "croix": 2, "21": 2, "wright": 2, "22": 2, "maz_shape_fil": 2, "route_type_bus_mode_dict": 2, "urb": 2, "loc": 2, "sub": [2, 6], "express": [2, 4, 6], "route_type_mode_dict": 2, "8": [2, 4, 6], "9": [2, 4], "cube_time_periods_nam": 2, "op": 2, "detail": [2, 5], "zone": 2, "possibl": [2, 6], "roadway_link_chang": 3, "roadway_node_chang": 3, "transit_chang": [3, 4], "base_roadway_network": 3, "base_cube_transit_network": 3, "build_cube_transit_network": 3, "project_nam": 3, "produc": [3, 6], "test_project": [3, 11], "create_project": [3, 11], "base_cube_transit_sourc": 3, "o": [3, 4, 11], "build_cube_transit_sourc": 3, "transit_route_shape_chang": [3, 11], "evaluate_chang": [3, 11], "write_project_card": [3, 11], "scratch_dir": [3, 11], "t_transit_shape_test": [3, 11], "yml": [3, 11], "default_project_nam": 3, 
"level": 3, "constant": 3, "static_valu": 3, "card_data": 3, "cubetransit": [3, 8], "bunch": 3, "projectcard": [3, 8], "case": 3, "standardtransit": [3, 8], "add_highway_chang": 3, "limit_variables_to_existing_network": 3, "hoc": [3, 11], "add_transit_chang": 3, "roadway_log_fil": 3, "roadway_shp_fil": 3, "roadway_csv_fil": 3, "network_build_fil": 3, "emme_node_id_crosswalk_fil": 3, "emme_name_crosswalk_fil": 3, "base_roadway_dir": 3, "base_transit_dir": [3, 4, 11], "consum": 3, "logfil": 3, "emm": [3, 4], "folder": 3, "base_cube_transit_fil": 3, "build_cube_transit_fil": 3, "first": [3, 4, 6], "recalcul": 3, "determine_roadway_network_changes_compat": 3, "emme_id_to_wrangler_id": 3, "emme_link_change_df": 3, "emme_node_change_df": 3, "emme_transit_changes_df": 3, "rewrit": 3, "wrangler": [3, 8, 11], "emme_name_to_wrangler_nam": 3, "aggreg": 3, "get_object_from_network_build_command": 3, "row": [3, 4], "histori": 3, "l": 3, "get_operation_from_network_build_command": 3, "action": 3, "c": [3, 6], "d": [3, 6], "read_logfil": 3, "logfilenam": 3, "reprsent": [3, 11], "read_network_build_fil": 3, "networkbuildfilenam": 3, "nework": 3, "assign_group": 3, "user": [3, 11], "TO": 3, "ptg_feed": 4, "hold": [4, 11], "feed": [4, 11], "partridg": [4, 11], "manipul": [4, 11], "cube_transit_net": [4, 11], "read_gtf": [4, 11], "write_as_cube_lin": [4, 11], "write_dir": [4, 11], "outfil": [4, 11], "calculate_cube_mod": 4, "assign": 4, "logic": 4, "route_typ": 4, "develop": [4, 11], "googl": 4, "cube_mod": 4, "route_type_to_cube_mod": 4, "tram": 4, "streetcar": 4, "light": 4, "further": 4, "disaggreg": 4, "buse": 4, "suburban": 4, "longnam": 4, "lower": [4, 6], "elif": 4, "99": 4, "local": [4, 11], "els": [4, 6], "route_long_nam": 4, "cube_format": 4, "represnt": 4, "notat": 4, "trip": 4, "trip_id": 4, "shape_id": [4, 11], "tod": 4, "onewai": 4, "oper": [4, 6], "fromtransitnetwork": 4, "transit_network_object": 4, "modelroadwaynetwork": [4, 8], "gtfs_feed_dir": 4, "route_properties_gtfs_to_cub": 4, "prepar": 4, "trip_df": 4, "shape_gtfs_to_cub": 4, "add_nntim": 4, "shape_gtfs_to_dict_list": 4, "step": 4, "through": 4, "todo": 4, "elimin": 4, "necess": 4, "tag": [4, 11], "begin": 4, "As": 4, "m": 4, "minim": 4, "modif": 4, "question": 4, "shape_pt_sequ": 4, "shape_mode_node_id": 4, "is_stop": 4, "stop_sequ": 4, "shape_gtfs_to_emm": 4, "trip_row": 4, "time_to_cube_time_period": 4, "start_time_sec": 4, "as_str": 4, "verbos": 4, "midnight": [4, 6], "this_tp": 4, "this_tp_num": 4, "outpath": 4, "after": 4, "setuplog": 5, "infologfilenam": 5, "debuglogfilenam": 5, "logtoconsol": 5, "infolog": 5, "ters": 5, "just": 5, "give": 5, "bare": 5, "composit": 5, "clear": 5, "later": 5, "debuglog": 5, "veri": [5, 6], "noisi": 5, "debug": 5, "spew": 5, "consol": 5, "point": 6, "arg": 6, "basegeometri": 6, "possibli": 6, "z": 6, "zero": 6, "dimension": 6, "float": 6, "sequenc": 6, "individu": 6, "p": 6, "print": 6, "almost_equ": 6, "decim": 6, "equal": 6, "place": 6, "deprec": 6, "sinc": 6, "confus": 6, "equals_exact": 6, "instead": 6, "approxim": 6, "compon": [6, 8], "linestr": 6, "1e": 6, "buffer": 6, "quad_seg": 6, "cap_styl": 6, "round": 6, "join_styl": 6, "mitre_limit": 6, "single_sid": 6, "dilat": 6, "eros": 6, "small": 6, "mai": 6, "sometim": 6, "tidi": 6, "polygon": 6, "around": [6, 8, 11], "resolut": 6, "angl": 6, "fillet": 6, "buffercapstyl": 6, "squar": 6, "flat": 6, "circular": 6, "rectangular": 6, "while": 6, "involv": 6, "bufferjoinstyl": 6, "mitr": 6, "bevel": 6, "midpoint": 6, "edg": [6, 8], "touch": 6, 
"depend": 6, "limit": 6, "ratio": 6, "sharp": 6, "corner": 6, "offset": 6, "meet": 6, "miter": 6, "extend": 6, "To": [6, 11], "prevent": 6, "unreason": 6, "control": 6, "maximum": 6, "exce": 6, "side": 6, "sign": 6, "left": 6, "hand": 6, "regular": 6, "cap": 6, "alwai": 6, "forc": 6, "equival": 6, "cap_flat": 6, "quadseg": 6, "alia": 6, "strictli": 6, "wkt": 6, "load": 6, "gon": 6, "approx": 6, "radiu": 6, "circl": 6, "1365484905459": 6, "128": 6, "141513801144": 6, "triangl": 6, "exterior": 6, "coord": 6, "contains_properli": 6, "complet": 6, "common": 6, "document": [6, 11], "covered_bi": 6, "cover": 6, "cross": 6, "grid_siz": 6, "disjoint": 6, "unitless": 6, "dwithin": 6, "given": [6, 11], "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "line_interpolate_point": 6, "intersect": 6, "line_locate_point": 6, "nearest": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "point_on_surfac": 6, "guarante": 6, "cheapli": 6, "representative_point": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "interior": 6, "unchang": 6, "is_ccw": 6, "clockwis": 6, "max_segment_length": 6, "vertic": 6, "longer": 6, "evenli": 6, "subdivid": 6, "densifi": 6, "unmodifi": 6, "array_lik": 6, "greater": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "union": 6, "dimens": 6, "bound": 6, "collect": 6, "empti": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "orient": 6, "rotat": 6, "rectangl": 6, "enclos": 6, "unlik": 6, "constrain": 6, "degener": 6, "oriented_envelop": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "pair": [6, 11], "tripl": 6, "abov": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, 
"transliter": 6, "unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, "unidecodeerror": 6, "substitut": 6, "might": [6, 11], "packag": [8, 11], "mtc": 8, "aim": 8, "networkwrangl": [8, 11], "refin": 8, "respect": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "clone": 8, "brief": 8, "intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, "rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "travel": 11, "break": 11, "publictransport": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", 
"line_properties"], [0, 3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], [1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", "calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "maz_shape_file"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatability"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", 
"get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, "", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", "roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, "", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", "cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_dict_list"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", "time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", 
"disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 3, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 9, "sphinx.domains.index": 1, "sphinx.domains.javascript": 3, "sphinx.domains.math": 2, "sphinx.domains.python": 4, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 58}, "alltitles": {"lasso.CubeTransit": [[0, "lasso-cubetransit"]], "lasso.ModelRoadwayNetwork": [[1, "lasso-modelroadwaynetwork"]], "Todo": [[1, "id1"], [1, "id2"], [1, "id3"], [1, "id4"], [1, "id5"], [1, "id6"]], "lasso.Parameters": [[2, "lasso-parameters"]], "lasso.Project": [[3, "lasso-project"]], "lasso.StandardTransit": [[4, "lasso-standardtransit"]], "lasso.logger": [[5, "module-lasso.logger"]], "lasso.util": [[6, "module-lasso.util"]], "Lasso Classes and Functions": 
[[7, "module-lasso"]], "Base Classes": [[7, "base-classes"]], "Utils and Functions": [[7, "utils-and-functions"]], "Welcome to lasso\u2019s documentation!": [[8, "welcome-to-lasso-s-documentation"]], "Contents:": [[8, null]], "Indices and tables": [[8, "indices-and-tables"]], "Running Lasso": [[9, "running-lasso"]], "Create project files": [[9, "create-project-files"]], "Create a scenario": [[9, "create-a-scenario"]], "Exporting networks": [[9, "exporting-networks"]], "Auditing and Reporting": [[9, "auditing-and-reporting"]], "Setup": [[10, "setup"]], "Projects": [[10, "projects"]], "Parameters": [[10, "parameters"], [11, "parameters"]], "Settings": [[10, "settings"]], "Additional Data Files": [[10, "additional-data-files"]], "Starting Out": [[11, "starting-out"]], "Installation": [[11, "installation"]], "Bleeding Edge": [[11, "bleeding-edge"]], "From Clone": [[11, "from-clone"]], "Brief Intro": [[11, "brief-intro"]], "Components": [[11, "components"]], "RoadwayNetwork": [[11, "roadwaynetwork"]], "TransitNetwork": [[11, "transitnetwork"]], "ProjectCard": [[11, "projectcard"]], "Scenario": [[11, "scenario"]], "Project": [[11, "project"]], "ModelRoadwayNetwork": [[11, "modelroadwaynetwork"]], "StandardTransit": [[11, "standardtransit"]], "CubeTransit": [[11, "cubetransit"]], "Typical Workflow": [[11, "typical-workflow"]], "Project Cards from Transit LIN Files": [[11, "project-cards-from-transit-lin-files"]], "Project Cards from Cube LOG Files": [[11, "project-cards-from-cube-log-files"]], "Model Network Files for a Scenario": [[11, "model-network-files-for-a-scenario"]], "Running Quickstart Jupyter Notebooks": [[11, "running-quickstart-jupyter-notebooks"]]}, "indexentries": {"cubetransit (class in lasso)": [[0, "lasso.CubeTransit"]], "__init__() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.__init__"]], "add_additional_time_periods() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_additional_time_periods"]], "add_cube() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_cube"]], "build_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.build_route_name"]], "calculate_start_end_times() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.calculate_start_end_times"]], "create_add_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_add_route_card_dict"]], "create_delete_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_delete_route_card_dict"]], "create_from_cube() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.create_from_cube"]], "create_update_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_update_route_card_dict"]], "cube_properties_to_standard_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.cube_properties_to_standard_properties"]], "diff_dict (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.diff_dict"]], "evaluate_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_differences"]], "evaluate_route_property_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_route_property_differences"]], "evaluate_route_shape_changes() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.evaluate_route_shape_changes"]], "get_time_period_numbers_from_cube_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.get_time_period_numbers_from_cube_properties"]], "line_properties (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.line_properties"]], "lines 
(lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.lines"]], "parameters (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.parameters"]], "program_type (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.program_type"]], "shapes (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.shapes"]], "source_list (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.source_list"]], "unpack_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.unpack_route_name"]], "calculated_values (lasso.modelroadwaynetwork attribute)": [[1, "lasso.ModelRoadwayNetwork.CALCULATED_VALUES"]], "modelroadwaynetwork (class in lasso)": [[1, "lasso.ModelRoadwayNetwork"]], "__init__() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.__init__"]], "add_counts() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_counts"]], "add_incident_link_data_to_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.add_incident_link_data_to_nodes"]], "add_new_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_new_roadway_feature_change"]], "add_variable_using_shst_reference() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_variable_using_shst_reference"]], "addition_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.addition_map"]], "apply() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply"]], "apply_managed_lane_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_managed_lane_feature_change"]], "apply_python_calculation() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_python_calculation"]], "apply_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_roadway_feature_change"]], "assess_connectivity() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.assess_connectivity"]], "build_selection_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.build_selection_key"]], "calculate_area_type() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_area_type"]], "calculate_centroidconnect() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_centroidconnect"]], "calculate_county() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_county"]], "calculate_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_distance"]], "calculate_mpo() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_mpo"]], "calculate_use() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_use"]], "convert_int() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.convert_int"]], "create_ml_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_ML_variable"]], "create_calculated_variables() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_calculated_variables"]], "create_dummy_connector_links() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_dummy_connector_links"]], "create_hov_corridor_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_hov_corridor_variable"]], "create_managed_lane_network() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.create_managed_lane_network"]], "create_managed_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_variable"]], "dataframe_to_fixed_width() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.dataframe_to_fixed_width"]], "delete_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.delete_roadway_feature_change"]], "deletion_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.deletion_map"]], "fill_na() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.fill_na"]], "from_roadwaynetwork() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.from_RoadwayNetwork"]], "get_attribute() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_attribute"]], "get_managed_lane_node_ids() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_managed_lane_node_ids"]], "get_modal_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_graph"]], "get_modal_links_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_links_nodes"]], "get_property_by_time_period_and_group() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.get_property_by_time_period_and_group"]], "identify_segment() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment"]], "identify_segment_endpoints() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment_endpoints"]], "is_network_connected() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.is_network_connected"]], "load_transform_network() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.load_transform_network"]], "network_connection_plot() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.network_connection_plot"]], "orig_dest_nodes_foreign_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.orig_dest_nodes_foreign_key"]], "ox_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.ox_graph"]], "path_search() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.path_search"]], "read() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read"]], "read_match_result() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read_match_result"]], "rename_variables_for_dbf() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.rename_variables_for_dbf"]], "roadway_net_to_gdf() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.roadway_net_to_gdf"]], "roadway_standard_to_met_council_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.roadway_standard_to_met_council_network"]], "select_roadway_features() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.select_roadway_features"]], "selection_has_unique_link_id() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_has_unique_link_id"]], "selection_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_map"]], "shortest_path() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.shortest_path"]], "split_properties_by_time_period_and_category() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.split_properties_by_time_period_and_category"]], "update_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.update_distance"]], "validate_link_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_link_schema"]], "validate_node_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_node_schema"]], "validate_properties() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_properties"]], "validate_selection() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_selection"]], "validate_shape_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_shape_schema"]], "validate_uniqueness() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_uniqueness"]], "write() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write"]], "write_roadway_as_fixedwidth() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_fixedwidth"]], "write_roadway_as_shp() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_shp"]], "parameters (class in lasso)": [[2, "lasso.Parameters"]], "__init__() (lasso.parameters method)": [[2, "lasso.Parameters.__init__"]], "county_link_range_dict (lasso.parameters attribute)": [[2, "lasso.Parameters.county_link_range_dict"]], "maz_shape_file (lasso.parameters attribute)": [[2, "lasso.Parameters.maz_shape_file"]], "properties_to_split (lasso.parameters attribute)": [[2, "lasso.Parameters.properties_to_split"]], "zones (lasso.parameters attribute)": [[2, "lasso.Parameters.zones"]], "calculated_values (lasso.project attribute)": [[3, "lasso.Project.CALCULATED_VALUES"]], "default_project_name (lasso.project attribute)": [[3, "id0"], [3, "lasso.Project.DEFAULT_PROJECT_NAME"]], "project (class in lasso)": [[3, "lasso.Project"]], "static_values (lasso.project attribute)": [[3, "id1"], [3, "lasso.Project.STATIC_VALUES"]], "__init__() (lasso.project method)": [[3, "lasso.Project.__init__"]], "add_highway_changes() (lasso.project method)": [[3, "lasso.Project.add_highway_changes"]], "add_transit_changes() (lasso.project method)": [[3, "lasso.Project.add_transit_changes"]], "base_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.base_cube_transit_network"]], "base_roadway_network (lasso.project attribute)": [[3, "lasso.Project.base_roadway_network"]], "build_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.build_cube_transit_network"]], "card_data (lasso.project attribute)": [[3, "lasso.Project.card_data"]], "create_project() (lasso.project static method)": [[3, "lasso.Project.create_project"]], "determine_roadway_network_changes_compatability() (lasso.project static method)": [[3, "lasso.Project.determine_roadway_network_changes_compatability"]], "emme_id_to_wrangler_id() (lasso.project static method)": [[3, "lasso.Project.emme_id_to_wrangler_id"]], "emme_name_to_wrangler_name() (lasso.project static method)": [[3, "lasso.Project.emme_name_to_wrangler_name"]], "evaluate_changes() (lasso.project method)": [[3, "lasso.Project.evaluate_changes"]], "get_object_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_object_from_network_build_command"]], "get_operation_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_operation_from_network_build_command"]], "parameters (lasso.project 
attribute)": [[3, "lasso.Project.parameters"]], "project_name (lasso.project attribute)": [[3, "lasso.Project.project_name"]], "read_logfile() (lasso.project static method)": [[3, "lasso.Project.read_logfile"]], "read_network_build_file() (lasso.project static method)": [[3, "lasso.Project.read_network_build_file"]], "roadway_link_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_link_changes"]], "roadway_node_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_node_changes"]], "transit_changes (lasso.project attribute)": [[3, "lasso.Project.transit_changes"]], "write_project_card() (lasso.project method)": [[3, "lasso.Project.write_project_card"]], "standardtransit (class in lasso)": [[4, "lasso.StandardTransit"]], "__init__() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.__init__"]], "calculate_cube_mode() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.calculate_cube_mode"]], "cube_format() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.cube_format"]], "evaluate_differences() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.evaluate_differences"]], "feed (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.feed"]], "fromtransitnetwork() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.fromTransitNetwork"]], "parameters (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.parameters"]], "read_gtfs() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.read_gtfs"]], "route_properties_gtfs_to_cube() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.route_properties_gtfs_to_cube"]], "shape_gtfs_to_cube() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_cube"]], "shape_gtfs_to_dict_list() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_dict_list"]], "shape_gtfs_to_emme() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_emme"]], "time_to_cube_time_period() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.time_to_cube_time_period"]], "write_as_cube_lin() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.write_as_cube_lin"]], "lasso.logger": [[5, "module-lasso.logger"]], "module": [[5, "module-lasso.logger"], [6, "module-lasso.util"], [7, "module-lasso"]], "setuplogging() (in module lasso.logger)": [[5, "lasso.logger.setupLogging"]], "point (class in lasso.util)": [[6, "lasso.util.Point"]], "polygon (class in lasso.util)": [[6, "lasso.util.Polygon"]], "almost_equals() (lasso.util.point method)": [[6, "lasso.util.Point.almost_equals"]], "almost_equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.almost_equals"]], "area (lasso.util.point property)": [[6, "lasso.util.Point.area"]], "area (lasso.util.polygon property)": [[6, "lasso.util.Polygon.area"]], "args (lasso.util.partial attribute)": [[6, "lasso.util.partial.args"]], "boundary (lasso.util.point property)": [[6, "lasso.util.Point.boundary"]], "boundary (lasso.util.polygon property)": [[6, "lasso.util.Polygon.boundary"]], "bounds (lasso.util.point property)": [[6, "lasso.util.Point.bounds"]], "bounds (lasso.util.polygon property)": [[6, "lasso.util.Polygon.bounds"]], "buffer() (lasso.util.point method)": [[6, "lasso.util.Point.buffer"]], "buffer() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.buffer"]], "centroid (lasso.util.point property)": [[6, "lasso.util.Point.centroid"]], "centroid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.centroid"]], 
"column_name_to_parts() (in module lasso.util)": [[6, "lasso.util.column_name_to_parts"]], "contains() (lasso.util.point method)": [[6, "lasso.util.Point.contains"]], "contains() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains"]], "contains_properly() (lasso.util.point method)": [[6, "lasso.util.Point.contains_properly"]], "contains_properly() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains_properly"]], "convex_hull (lasso.util.point property)": [[6, "lasso.util.Point.convex_hull"]], "convex_hull (lasso.util.polygon property)": [[6, "lasso.util.Polygon.convex_hull"]], "coords (lasso.util.point property)": [[6, "lasso.util.Point.coords"]], "coords (lasso.util.polygon property)": [[6, "lasso.util.Polygon.coords"]], "covered_by() (lasso.util.point method)": [[6, "lasso.util.Point.covered_by"]], "covered_by() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covered_by"]], "covers() (lasso.util.point method)": [[6, "lasso.util.Point.covers"]], "covers() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covers"]], "create_locationreference() (in module lasso.util)": [[6, "lasso.util.create_locationreference"]], "crosses() (lasso.util.point method)": [[6, "lasso.util.Point.crosses"]], "crosses() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.crosses"]], "difference() (lasso.util.point method)": [[6, "lasso.util.Point.difference"]], "difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.difference"]], "disjoint() (lasso.util.point method)": [[6, "lasso.util.Point.disjoint"]], "disjoint() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.disjoint"]], "distance() (lasso.util.point method)": [[6, "lasso.util.Point.distance"]], "distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.distance"]], "dwithin() (lasso.util.point method)": [[6, "lasso.util.Point.dwithin"]], "dwithin() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.dwithin"]], "envelope (lasso.util.point property)": [[6, "lasso.util.Point.envelope"]], "envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.envelope"]], "equals() (lasso.util.point method)": [[6, "lasso.util.Point.equals"]], "equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals"]], "equals_exact() (lasso.util.point method)": [[6, "lasso.util.Point.equals_exact"]], "equals_exact() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals_exact"]], "exterior (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.exterior"]], "exterior (lasso.util.polygon property)": [[6, "id0"]], "from_bounds() (lasso.util.polygon class method)": [[6, "lasso.util.Polygon.from_bounds"]], "func (lasso.util.partial attribute)": [[6, "lasso.util.partial.func"]], "geodesic_point_buffer() (in module lasso.util)": [[6, "lasso.util.geodesic_point_buffer"]], "geom_type (lasso.util.point property)": [[6, "lasso.util.Point.geom_type"]], "geom_type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.geom_type"]], "geometrytype() (lasso.util.point method)": [[6, "lasso.util.Point.geometryType"]], "geometrytype() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.geometryType"]], "get_shared_streets_intersection_hash() (in module lasso.util)": [[6, "lasso.util.get_shared_streets_intersection_hash"]], "has_z (lasso.util.point property)": [[6, "lasso.util.Point.has_z"]], "has_z (lasso.util.polygon property)": [[6, "lasso.util.Polygon.has_z"]], "hausdorff_distance() (lasso.util.point method)": [[6, "lasso.util.Point.hausdorff_distance"]], "hausdorff_distance() (lasso.util.polygon 
method)": [[6, "lasso.util.Polygon.hausdorff_distance"]], "hhmmss_to_datetime() (in module lasso.util)": [[6, "lasso.util.hhmmss_to_datetime"]], "interiors (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.interiors"]], "interiors (lasso.util.polygon property)": [[6, "id1"]], "interpolate() (lasso.util.point method)": [[6, "lasso.util.Point.interpolate"]], "interpolate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.interpolate"]], "intersection() (lasso.util.point method)": [[6, "lasso.util.Point.intersection"]], "intersection() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersection"]], "intersects() (lasso.util.point method)": [[6, "lasso.util.Point.intersects"]], "intersects() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersects"]], "is_closed (lasso.util.point property)": [[6, "lasso.util.Point.is_closed"]], "is_closed (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_closed"]], "is_empty (lasso.util.point property)": [[6, "lasso.util.Point.is_empty"]], "is_empty (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_empty"]], "is_ring (lasso.util.point property)": [[6, "lasso.util.Point.is_ring"]], "is_ring (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_ring"]], "is_simple (lasso.util.point property)": [[6, "lasso.util.Point.is_simple"]], "is_simple (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_simple"]], "is_valid (lasso.util.point property)": [[6, "lasso.util.Point.is_valid"]], "is_valid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_valid"]], "keywords (lasso.util.partial attribute)": [[6, "lasso.util.partial.keywords"]], "lasso.util": [[6, "module-lasso.util"]], "length (lasso.util.point property)": [[6, "lasso.util.Point.length"]], "length (lasso.util.polygon property)": [[6, "lasso.util.Polygon.length"]], "line_interpolate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_interpolate_point"]], "line_interpolate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_interpolate_point"]], "line_locate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_locate_point"]], "line_locate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_locate_point"]], "minimum_clearance (lasso.util.point property)": [[6, "lasso.util.Point.minimum_clearance"]], "minimum_clearance (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_clearance"]], "minimum_rotated_rectangle (lasso.util.point property)": [[6, "lasso.util.Point.minimum_rotated_rectangle"]], "minimum_rotated_rectangle (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_rotated_rectangle"]], "normalize() (lasso.util.point method)": [[6, "lasso.util.Point.normalize"]], "normalize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.normalize"]], "oriented_envelope (lasso.util.point property)": [[6, "lasso.util.Point.oriented_envelope"]], "oriented_envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.oriented_envelope"]], "overlaps() (lasso.util.point method)": [[6, "lasso.util.Point.overlaps"]], "overlaps() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.overlaps"]], "partial (class in lasso.util)": [[6, "lasso.util.partial"]], "point_on_surface() (lasso.util.point method)": [[6, "lasso.util.Point.point_on_surface"]], "point_on_surface() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.point_on_surface"]], "project() (lasso.util.point method)": [[6, "lasso.util.Point.project"]], "project() (lasso.util.polygon method)": [[6, 
"lasso.util.Polygon.project"]], "relate() (lasso.util.point method)": [[6, "lasso.util.Point.relate"]], "relate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate"]], "relate_pattern() (lasso.util.point method)": [[6, "lasso.util.Point.relate_pattern"]], "relate_pattern() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate_pattern"]], "representative_point() (lasso.util.point method)": [[6, "lasso.util.Point.representative_point"]], "representative_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.representative_point"]], "reverse() (lasso.util.point method)": [[6, "lasso.util.Point.reverse"]], "reverse() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.reverse"]], "secs_to_datetime() (in module lasso.util)": [[6, "lasso.util.secs_to_datetime"]], "segmentize() (lasso.util.point method)": [[6, "lasso.util.Point.segmentize"]], "segmentize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.segmentize"]], "shorten_name() (in module lasso.util)": [[6, "lasso.util.shorten_name"]], "simplify() (lasso.util.point method)": [[6, "lasso.util.Point.simplify"]], "simplify() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.simplify"]], "svg() (lasso.util.point method)": [[6, "lasso.util.Point.svg"]], "svg() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.svg"]], "symmetric_difference() (lasso.util.point method)": [[6, "lasso.util.Point.symmetric_difference"]], "symmetric_difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.symmetric_difference"]], "touches() (lasso.util.point method)": [[6, "lasso.util.Point.touches"]], "touches() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.touches"]], "transform() (in module lasso.util)": [[6, "lasso.util.transform"]], "type (lasso.util.point property)": [[6, "lasso.util.Point.type"]], "type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.type"]], "unidecode() (in module lasso.util)": [[6, "lasso.util.unidecode"]], "union() (lasso.util.point method)": [[6, "lasso.util.Point.union"]], "union() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.union"]], "within() (lasso.util.point method)": [[6, "lasso.util.Point.within"]], "within() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.within"]], "wkb (lasso.util.point property)": [[6, "lasso.util.Point.wkb"]], "wkb (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb"]], "wkb_hex (lasso.util.point property)": [[6, "lasso.util.Point.wkb_hex"]], "wkb_hex (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb_hex"]], "wkt (lasso.util.point property)": [[6, "lasso.util.Point.wkt"]], "wkt (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkt"]], "x (lasso.util.point property)": [[6, "lasso.util.Point.x"]], "xy (lasso.util.point property)": [[6, "lasso.util.Point.xy"]], "xy (lasso.util.polygon property)": [[6, "lasso.util.Polygon.xy"]], "y (lasso.util.point property)": [[6, "lasso.util.Point.y"]], "z (lasso.util.point property)": [[6, "lasso.util.Point.z"]], "lasso": [[7, "module-lasso"]]}}) \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/setup/index.html b/branch/seperate_maz_and_taz/setup/index.html new file mode 100644 index 0000000..224865c --- /dev/null +++ b/branch/seperate_maz_and_taz/setup/index.html @@ -0,0 +1,133 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/seperate_maz_and_taz/starting/index.html b/branch/seperate_maz_and_taz/starting/index.html new file mode 100644 index 0000000..1741429 --- /dev/null +++ b/branch/seperate_maz_and_taz/starting/index.html @@ -0,0 +1,434 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

Example using a conda environment (recommended) and the pip package manager to install Lasso from source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPI repository because it is included in Lasso's requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the develop branch of each repository:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in editable mode.

+

If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e installs it in editable mode.

  2. +
  3. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on GitHub.

  4. +
  5. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @tag

  6. +
  7. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same clone-and-install steps used for Lasso above

  8. +
+

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc. +with Python syntax highlighting turned on.

  • +
  • GitHub Desktop to locally update your clones

  • +
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of Open Street Map and Shared Streets. In Network Wrangler these are read in from three json files representing: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category.

  • +
  • [transit network], which is based on a frequency-based implementation of the csv-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml.

  • +
+

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds data fields to the roadway network schema that MetCouncil uses in their travel model including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios. Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object is a subclass of RoadwayNetwork and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object for holding a standard transit feed as a Partridge object, with methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. It can parse cube line file properties and shapes into python dictionaries and compare line files, representing changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default parameters listed in this class.

  • +
+
+

RoadwayNetwork

+

Reads, writes, queries and manipulates roadway network data, which is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key ='shape_id',
+        
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+my_net.apply_roadway_feature_change(
+    my_net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
+
+

ProjectCard

+
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key ='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network;

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler's RoadwayNetwork class with additional understanding about how to translate and write the network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key ='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the project class and has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration  file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# network written with direction from the parameters given
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. +
  3. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.

  4. +
+
+

Project Cards from Transit LIN Files

+
+
+

Project Cards from Cube LOG Files

+
+
+

Model Network Files for a Scenario

+
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic lasso functionality, please refer to the following jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/.buildinfo b/branch/test_no_change/.buildinfo new file mode 100644 index 0000000..b0ac44f --- /dev/null +++ b/branch/test_no_change/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: a6ecde6fa215bda40c6c2c465fb089ee +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/branch/test_no_change/_generated/lasso.CubeTransit/index.html b/branch/test_no_change/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..1bdcd1f --- /dev/null +++ b/branch/test_no_change/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,571 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type:
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type:
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type:
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type:
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type:
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change WARNING: Doesn't take care of discontinuous time periods!!!!

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores it as a CubeTransit object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and constructs and returns a list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters:
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns:
+

Line name with new time period.

+
+
+
+ +
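A minimal usage sketch, assuming an existing CubeTransit instance tn; the line name is hypothetical and simply follows the naming pattern documented under build_route_name:

# copy the pk-period line's properties to cube time period 2 (illustrative values)
new_line_name = tn.add_additional_time_periods(2, "0_452-111_452_pk1")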
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
+ +
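A hedged sketch, reusing the CUBE_DIR constant and the transit.LIN file name that appear in other examples in these docs:

tn = CubeTransit.create_from_cube(CUBE_DIR)
tn.add_cube(os.path.join(CUBE_DIR, "transit.LIN"))  # adds a second line file to the same instance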
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters:
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns:
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
+
+ +
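A short sketch using the example values listed above:

name = CubeTransit.build_route_name(
    route_id="452-111", time_period="pk", agency_id=0, direction_id=1
)
# per the docstring, this is expected to yield "0_452-111_452_pk1"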
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. WARNING: Doesn't take care of discontinuous time periods!!!!

+
+
Parameters:
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters:
+

line – name of line that is being updated

+
+
Returns:
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters:
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores it as a CubeTransit object.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns:
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters:
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period specific variables like headway, and variables that have standard units like headway, which is minutes in cube and seconds in standard format.

+
+
Parameters:
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns:
+

+
"property": <standard style property name>, "set": <property value with correct units>

+
+
+

+
+
Return type:
+

A list of dictionaries with values for `”property”

+
+
+
+ +
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
  3. +
    For routes being added or updated, identify if the time periods

    have changed or if there are multiples, and make duplicate lines if so

    +
    +
    +
  4. +
  5. Create project card dictionaries for each change.

  6. +
+
+
Parameters:
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns:
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters:
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns:
+

+
a list of dictionary values suitable for writing to a project card

{
    'property': <property_name>,
    'set': <set value>,
    'change': <change from existing value>,
    'existing': <existing value to check>,
}

+
+
+

+
+
Return type:
+

transit_change_list (list)

+
+
+
+ +
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and constructs and returns a list of changes suitable for a project card.

+
+
Parameters:
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns:
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and returns the numbers in them.

+
+
Parameters:
+

properties_list (list) – list of all properties.

+
+
Returns:
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters:
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns:
+

route_id (str): i.e. 452-111
time_period (str): i.e. pk
direction_id (str): i.e. 1
agency_id (str): i.e. 0

+
+
Return type:
+

route_id (str)

+
+
+
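A sketch of the inverse operation, using the documented example name; the components returned are those listed under Returns above:

route_info = CubeTransit.unpack_route_name("0_452-111_452_pk1")
# carries the route id ("452-111"), time period ("pk"), direction ("1") and agency ("0")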
+ +
+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_generated/lasso.ModelRoadwayNetwork/index.html b/branch/test_no_change/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..8837127 --- /dev/null +++ b/branch/test_no_change/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1573 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters:
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is "strongly" connected A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints:
+

list of length of two unique keys of nodes making up endpoints of segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode:
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is "strongly" connected A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links:
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

+
rtype:
+

GeoDataFrame

+
+
+

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what metcouncil's model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
rtype:
+

bool

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
rtype:
+

tuple

+
+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that that property exists in the network.

validate_selection(selection[, ...])

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. +#MC +join the network with count node data, via SHST API node match result

+
+
Parameters:
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns:
+

None

+
+
+
+ +
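A hedged usage sketch; the file paths below are hypothetical placeholders for the MnDOT/WisDOT SHST match results described above:

net.add_counts(
    network_variable="AADT",
    mndot_count_shst_data="mn_count_shst_match.csv",  # hypothetical path
    widot_count_shst_data="wi_count_shst_match.csv",  # hypothetical path
)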
+ +

Add data from links going to/from nodes to node.

+
+
Return type:
+

DataFrame

+
+
Parameters:
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns:
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

add the new roadway features defined in the project card. +new shapes are also added for the new roadway links.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters:
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True is overwriting existing variable. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
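A minimal sketch with hypothetical file and variable names:

net.add_variable_using_shst_reference(
    var_shst_csvdata="shst_match_result.csv",  # hypothetical SHST API return
    shst_csv_variable="speed_limit",           # hypothetical source field
    network_variable="posted_speed",           # hypothetical target field
    network_var_type=int,
    overwrite=True,
)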
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters:
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters:
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph) +List of disconnected subgraphs described by the list of their

+
+

member nodes (as described by their model_node_id)

+
+
+
+
+ +
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Return type:
+

tuple

+
+
Parameters:
+

selection_dictionary – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. This isn't perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in the area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
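A hedged sketch; the shapefile path and field name are hypothetical:

net.calculate_area_type(
    area_type_shape="area_type_boundaries.shp",  # hypothetical path
    area_type_shape_variable="AREA_TYPE",        # hypothetical field name
    network_variable="area_type",
    overwrite=True,
)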
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters:
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

RoadwayNetwork

+
+
+
+ +
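A minimal sketch, assuming a lasso Parameters instance and an illustrative TAZ threshold:

net.calculate_centroidconnect(
    parameters=my_parameters,   # a lasso Parameters instance
    highest_taz_number=3100,    # hypothetical max TAZ number
    as_integer=True,
)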
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. This isn't perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in the county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters:
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable. #MC

Parameters:
  • county_network_variable (str) – Name of the variable where the county names are stored. Default to "county".
  • network_variable (str) – Name of the variable that should be written to. Default to "mpo".
  • as_integer (bool) – If True, will convert true/false to 1/0s.
  • mpo_counties (list) – List of county names that are within the mpo region.
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters:
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Created ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0, ML info comes from cube LOG file and store in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters:
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters:
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Created hov corridor placeholder for project to write out corridor changes

+

hov corridor id default to 0, its info comes from cube LOG file and store in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Return type:
+

RoadwayNetwork

+
+
Parameters:
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance

+
+ +
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Created placeholder for project to write out managed

+

managed default to 0, its info comes from cube LOG file and store in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters:
+

df (pandas DataFrame) –

+
+
Returns:
+

dataframe with fixed width for each column. +dict: dictionary with columns names as keys, column width as values.

+
+
Return type:
+

pandas dataframe

+
+
+
+ +
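A small self-contained sketch showing the two return values described above (the fixed-width dataframe and the column-width dictionary):

import pandas as pd

df = pd.DataFrame({"model_link_id": [1, 22, 333], "name": ["A St", "B Ave", "C Blvd"]})
fixed_width_df, max_width_dict = ModelRoadwayNetwork.dataframe_to_fixed_width(df)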
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

delete the roadway features defined in the project card. +valid links and nodes defined in the project gets deleted +and shapes corresponding to the deleted links are also deleted.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters:
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDS by a scalar. +..todo #237 what if node ids are not a number?

+
+
Parameters:
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers

+
+ +
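Since the scalar is simply added to each node ID, a quick sketch:

ml_node_ids = ModelRoadwayNetwork.get_managed_lane_node_ids([10, 20, 30])
# with the default scalar of 4500000 this yields [4500010, 4500020, 4500030]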
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

links with walk access are not marked as having walk access +Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145 +modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1]

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters:
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type:
+

pandas series

+
+
+
+ +
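A hedged sketch reusing the example time period and category from the parameter list; the "lanes" property name is illustrative:

pm_sov_lanes = net.get_property_by_time_period_and_group(
    "lanes",                          # illustrative property name
    time_period=["16:00", "19:00"],
    category="sov",
)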
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters:
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters:
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variable in the links_df file. If nothing is specified, assume whatever is default.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters:
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
+ +
+
Parameters:
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes network standard.

+
+
Parameters:
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reads lots of the same type of file and concatenates them into a single DataFrame.

+
+
Parameters:
+

path (str) – File path to SHST match results.

+
+
Returns:
+

geopandas geodataframe

+
+
Return type:
+

geodataframe

+
+
+

TODO: not sure why we need this; it should live in utilities rather than this class.

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Renames attributes for DBF/SHP and makes sure field name lengths are within 10 characters.

+
+
Parameters:
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns:
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+
+
Return type:
+

GeoDataFrame

+
+
+

Turns the roadway network into a GeoDataFrame.

Parameters: roadway_net – the roadway network to export

+

returns: shapes dataframe

+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Renames and formats roadway attributes to be consistent with what MetCouncil’s model is expecting (#MC).

Parameters: output_epsg (int) – EPSG number of the output network.

+
+
Returns:
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Return type:
+

GeoDataFrame

+
+
+
+
Example usage:
+
net.select_roadway_features(
    selection=[{
        # a match condition for the from node using osm,
        # shared streets, or model node number
        'from': {'osm_model_link_id': '1234'},
        # a match for the to-node..
        'to': {'shstid': '4321'},
        # a regex or match for facility condition
        # could be # of lanes, facility type, etc.
        'facility': {'name': 'Main St'},
    }, ...])
+
+
+
+
+
+
+
+
Parameters:
+
    +
  • selection – dictionary with keys for: A – from node; B – to node; link – which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiple used to discourage shortest paths which meander from the original search returned from a name or ref query. If not set here, will default to the value of sp_weight_factor in the RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+ +
+ +
+
Return type:
+

bool

+
+
Parameters:
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters:
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple of length four:
  • Boolean indicating whether a shortest path was found
  • nx directed graph of graph links
  • route of shortest path nodes as a list
  • links in shortest path selected from links_df

+
+ +
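A sketch of consuming the four-element return value described above (candidate_links_df and the node IDs are illustrative placeholders):

sp_found, sp_graph, sp_route, sp_links_df = net.shortest_path(
    graph_links_df=candidate_links_df,  # illustrative selection of links likely on the path
    O_id=1001,                          # illustrative origin node foreign key
    D_id=2002,                          # illustrative destination node foreign key
)
if sp_found:
    print(sp_route)          # list of node foreign keys along the shortest path
    print(len(sp_links_df))  # links selected from links_df along that path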
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure like the one shown below.

+
+
Parameters:
+

properties_to_split (dict) – dictionary of output variable prefix mapped to the source variable and what to stratify it by, e.g.:

{
    'lanes': {'v': 'lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'ML_lanes': {'v': 'ML_lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'use': {'v': 'use', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
}

+

+
+
+
+ +
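A sketch of a call using the dictionary structure documented above (the 'times_periods' key name follows the docstring; the time ranges shown are the MetCouncil AM/PM defaults):

net.split_properties_by_time_period_and_category(
    {
        "lanes": {
            "v": "lanes",
            "times_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
    }
)
# links_df is then expected to carry time-period columns such as lanes_AM and lanes_PM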
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters:
+
    +
  • links_df – Links GeoDataFrame. Useful if you want to update only a portion of the network links (e.g. only centroid connectors). If not provided, will use the entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns:
+

links_df with updated distance

+
+
+
+ +
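A sketch of a typical call, preferring true shape length where available and filling only missing values:

net.update_distance(
    use_shapes=True,     # use length from self.shapes_df rather than crow-fly where a shape exists
    units="miles",
    overwrite=False,     # only fill NaN distances, leave existing values alone
    inplace=True,
)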
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that that +property exists in the network.

+
+
Return type:
+

bool

+
+
Parameters:
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn’t a specified value in the project card for existing when a change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluates whether the selection dictionary contains the minimum required values.

+
+
Return type:
+

bool

+
+
Parameters:
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type:
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • path – the path where the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does:
  1. write out link and node fixed width data files for cube.
  2. write out header and width correspondence.
  3. write out cube network building script with header and width specification.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns:
+

None

+
+
+
+ +
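A usage sketch; the file names mirror the scratch defaults listed in the Parameters class and are illustrative:

net.write_roadway_as_fixedwidth(
    output_dir="tests/scratch",
    output_link_txt="links.txt",
    output_node_txt="nodes.txt",
    output_link_header_width_txt="links_header_width.txt",
    output_node_header_width_txt="nodes_header_width.txt",
    output_cube_network_script="make_complete_network_from_fixed_width_file.s",
    drive_only=False,
)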
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name used to output additional link subset layers.

  • +
+
+
Returns:
+

None

+
+
+
+ +
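A usage sketch writing csv output plus a GeoPackage (file and layer names are illustrative):

net.write_roadway_as_shp(
    output_dir="tests/scratch",
    data_to_csv=True,
    data_to_dbf=False,
    output_link_csv="links.csv",
    output_node_csv="nodes.csv",
    output_gpkg="roadway.gpkg",
    output_link_gpkg_layer="links",
    output_node_gpkg_layer="nodes",
)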
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +
diff --git a/branch/test_no_change/_generated/lasso.Parameters/index.html b/branch/test_no_change/_generated/lasso.Parameters/index.html
new file mode 100644
index 0000000..9654be6
--- /dev/null
+++ b/branch/test_no_change/_generated/lasso.Parameters/index.html
@@ -0,0 +1,555 @@
+lasso.Parameters — lasso documentation

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default values listed in this class. For example:

+
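A minimal sketch of such a runtime override (the attribute names are those documented below; the values are illustrative, not recommendations):

from lasso import Parameters

params = Parameters(
    centroid_connect_lanes=2,   # illustrative override of the default of 1
    output_epsg=26915,          # illustrative EPSG override
)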
+
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00, "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class.Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +

Methods

+ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ + + + + + + + + + + + + + + +

maz_shape_file

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+
self.county_code_dict = {
    "Anoka": 1,
    "Carver": 2,
    "Dakota": 3,
    "Hennepin": 4,
    "Ramsey": 5,
    "Scott": 6,
    "Washington": 7,
    "external": 10,
    "Chisago": 11,
    "Goodhue": 12,
    "Isanti": 13,
    "Le Sueur": 14,
    "McLeod": 15,
    "Pierce": 16,
    "Polk": 17,
    "Rice": 18,
    "Sherburne": 19,
    "Sibley": 20,
    "St. Croix": 21,
    "Wright": 22,
}

+
+ +
+
+maz_shape_file
+

#MC
self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

self.route_type_mode_dict = {0: 8, 2: 9}

self.cube_time_periods = {"1": "AM", "2": "MD"}
self.cube_time_periods_name = {"AM": "pk", "MD": "op"}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link. The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +
diff --git a/branch/test_no_change/_generated/lasso.Project/index.html b/branch/test_no_change/_generated/lasso.Project/index.html
new file mode 100644
index 0000000..db9cc2c
--- /dev/null
+++ b/branch/test_no_change/_generated/lasso.Project/index.html
@@ -0,0 +1,521 @@
+lasso.Project — lasso documentation

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type:
+

dict

+
+
+
+ +
+
+roadway_link_changes
+

pandas dataframe of CUBE roadway link changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type:
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

ProjectCard constructor.

+
+
Parameters:
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of ProjectCard

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

ProjectCard constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatability(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

emme_name_to_wrangler_name(...)

rename emme names to wrangler names using crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

determine the network build object is node or link

get_operation_from_network_build_command()

determine the network build object action type

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters:
+

limit_variables_to_existing_network (bool) – True if no ad-hoc variables. Default to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters:
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

A Project instance.

+
+
+
+ +
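A rough sketch of the roadway-logfile workflow, mirroring the class-level usage example (the directory constants and file names are illustrative):

import os

project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "example_build.log"),  # illustrative Cube logfile
    base_roadway_dir=BASE_ROADWAY_DIR,                             # illustrative base network folder
    project_name="example roadway project",
)
project.evaluate_changes()
project.write_project_card(os.path.join(SCRATCH_DIR, "example_roadway.yml"))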
+
+static determine_roadway_network_changes_compatability(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

rename emme names to wrangler names using crosswalk file

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

determine the network build object is node or link

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

determine the network build object action type

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns:
+

A DataFrame representation of the log file.

+
+
+
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

networkbuildfilename (str or list[str]) – File path to EMME network build file or list of network build file paths.

+
+
Returns:
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters:
+

filename (str) – File path to output .yml

+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
diff --git a/branch/test_no_change/_generated/lasso.StandardTransit/index.html b/branch/test_no_change/_generated/lasso.StandardTransit/index.html
new file mode 100644
index 0000000..07f2d1d
--- /dev/null
+++ b/branch/test_no_change/_generated/lasso.StandardTransit/index.html
@@ -0,0 +1,453 @@
+lasso.StandardTransit — lasso documentation

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters:
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed:
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by following logic.

cube_format(row)

Creates a string representing the route in cube line file notation (#MC), given a row of a DataFrame representing a cube-formatted trip with the attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR.

evaluate_differences(transit_changes)

Compare changes from the transit_changes dataframe with the standard transit network returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

RoadwayNetwork to ModelRoadwayNetwork

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes that for the route in appropriate cube format.

shape_gtfs_to_dict_list(trip_id, shape_id, ...)

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of stepping through the routed nodes and corresponding them with shape nodes.

shape_gtfs_to_emme(trip_row)

Creates transit segment for the trips in appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number by following this logic (#MC).

For rail, uses the GTFS route_type variable
(https://developers.google.com/transit/gtfs/reference):

# route_type : cube_mode
route_type_to_cube_mode = {0: 8,  # Tram, Streetcar, Light rail
                           3: 0,  # Bus; further disaggregated for cube
                           2: 9}  # Rail

For buses, uses route id numbers and route names to find
express and suburban buses as follows:

if not cube_mode:
    if 'express' in row['LONGNAME'].lower():
        cube_mode = 7  # Express
    elif int(row['route_id'].split("-")[0]) > 99:
        cube_mode = 6  # Suburban Local
    else:
        cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters:
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns:
+

cube mode number

+
+
+
+ +
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation (#MC).

Parameters:

row – row of a DataFrame representing a cube-formatted trip, with the attributes

+
+

trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns:
+

string representation of route in cube line file notation

+
+
+
+ +
+
+evaluate_differences(transit_changes)[source]
+

Compare changes from the transit_changes dataframe with the standard transit network +returns the project card changes in dictionary format

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters:
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepares gtfs for cube lin file (#MC).

Does the following operations:
  1. Combines route, frequency, trip, and shape information
  2. Converts time of day to time periods
  3. Calculates cube route name from gtfs route name and properties
  4. Assigns a cube-appropriate mode number
  5. Assigns a cube-appropriate operator number

+
+
Returns:
+

+
DataFrame of trips with cube-appropriate values for:
    +
  • NAME

  • +
  • ONEWAY

  • +
  • OPERATOR

  • +
  • MODE

  • +
  • HEADWAY

  • +
+
+
+

+
+
Return type:
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes that for the route in appropriate +cube format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list

for a route in cube format.

+
+
+
+ +
+
+shape_gtfs_to_dict_list(trip_id, shape_id, add_nntime)[source]
+

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of +stepping through the routed nodes and corresponding them with shape nodes.

+

TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when +the transit routing on the roadway network is first performed.

+

As such, I’m copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications.

+
+
Parameters:
+
    +
  • trip_id – trip_id of the trip in question

  • shape_id – shape_id of the trip in question

  • +
+
+
Returns:
+

trip_id +shape_id +shape_pt_sequence +shape_mode_node_id +is_stop +access +stop_sequence

+
+
Return type:
+

list of dict records with columns

+
+
+
+ +
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segment for the trips in appropriate +emme format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a dataframe representation of the transit segment

for a trip in emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters:
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns:
+

+
if as_str is False, returns the numeric

time period

+
+
this_tp: if as_str is True, returns the Cube time period

name abbreviation

+
+
+

+
+
Return type:
+

this_tp_num

+
+
+
+ +
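A small sketch, assuming the default MetCouncil time periods documented in the Parameters class (so a trip starting at 8:00 a.m. is expected to fall in "AM"); cube_transit_net is the instance from the class-level usage example:

start_secs = 8 * 3600                                                           # 8:00 a.m. in seconds from midnight
tp_name = cube_transit_net.time_to_cube_time_period(start_secs)                 # expected "AM"
tp_num = cube_transit_net.time_to_cube_time_period(start_secs, as_str=False)    # numeric time period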
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the gtfs feed as a cube line file after +converting gtfs properties to MetCouncil cube properties. +#MC +:param outpath: File location for output cube line file.

+
+ +
+ +
diff --git a/branch/test_no_change/_generated/lasso.logger/index.html b/branch/test_no_change/_generated/lasso.logger/index.html
new file mode 100644
index 0000000..b6cabdb
--- /dev/null
+++ b/branch/test_no_change/_generated/lasso.logger/index.html
@@ -0,0 +1,143 @@
+lasso.logger — lasso documentation

lasso.logger

+

Functions

+ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The info log is terse, giving just the bare minimum of details so the network composition will be clear later. The debug log is very noisy, for debugging.

Pass None for either filename to skip that log. Also spews it all out to the console if logToConsole is True.

+
+ +
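A minimal usage sketch (the log file names are illustrative):

from lasso.logger import setupLogging

setupLogging(
    "lasso_info.log",    # terse info log
    "lasso_debug.log",   # noisy debug log
    logToConsole=True,
)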
diff --git a/branch/test_no_change/_generated/lasso.util/index.html b/branch/test_no_change/_generated/lasso.util/index.html
new file mode 100644
index 0000000..d088911
--- /dev/null
+++ b/branch/test_no_change/_generated/lasso.util/index.html
@@ -0,0 +1,1696 @@
+lasso.util — lasso documentation

lasso.util

+

Functions

+ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

creates circular buffer polygon for node

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from a seconds from midnight

shorten_name(name)

+
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A geometry type that represents a single coordinate with +x,y and possibly z values.

+

A point is a zero-dimensional feature and has zero length and zero area.

+
+
Parameters:
+

args (float, or sequence of floats) –

The coordinates can either be passed as a single parameter, or as +individual float values using multiple parameters:

+
    +
  1. 1 parameter: a sequence or array-like with 2 or 3 values.

  2. 2 or 3 parameters (float): x, y, and possibly z.
+

+
+
+
+
+x, y, z
+

Coordinate values

+
+
Type:
+

float

+
+
+
+ +

Examples

+

Constructing the Point using separate parameters for x and y:

+
>>> p = Point(1.0, -1.0)
+
+
+

Constructing the Point using a list of x, y coordinates:

+
>>> p = Point([1.0, -1.0])
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
  • equality (This method considers coordinate) –

  • +
  • requires (which) –

  • +
  • components (coordinates to be equal and in the same order for all) –

  • +
  • geometry. (of a) –

  • +
  • two (Because of this it is possible for "equals()" to be True for) –

  • +
  • False. (geometries and "equals_exact()" to be) –

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.
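
Example (a square rotated 45 degrees: its oriented envelope hugs it more tightly than the axis-aligned envelope; exact floating-point values may vary slightly):

+
+>>> from shapely import Polygon
+>>> diamond = Polygon([(1, 0), (2, 1), (1, 2), (0, 1)])
+>>> diamond.envelope.area
+4.0
+>>> diamond.minimum_rotated_rectangle.area < diamond.envelope.area
+True
+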

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A geometry type representing an area that is enclosed by a linear ring.

+

A polygon is a two-dimensional feature and has a non-zero area. It may +have one or more negative-space “holes” which are also bounded by linear +rings. If any rings cross each other, the feature is invalid and +operations on it may fail.

+
+
Parameters:
+
    +
  • shell (sequence) – A sequence of (x, y [,z]) numeric coordinate pairs or triples, or +an array-like with shape (N, 2) or (N, 3). +Also can be a sequence of Point objects.

  • +
  • holes (sequence) – A sequence of objects which satisfy the same requirements as the +shell parameters above

  • +
+
+
+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type:
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type:
+

sequence

+
+
+
+ +

Examples

+

Create a square polygon with no holes

+
>>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.))
+>>> polygon = Polygon(coords)
+>>> polygon.area
+1.0
+
+
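
A square with one square hole can be built by passing the holes sequence; the hole's area is subtracted from the shell's:

+
+>>> outer = ((0., 0.), (0., 2.), (2., 2.), (2., 0.), (0., 0.))
+>>> hole = ((0.5, 0.5), (0.5, 1.), (1., 1.), (1., 0.5), (0.5, 0.5))
+>>> Polygon(outer, [hole]).area
+3.75
+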
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_sided (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side; a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

12-gon approximation (3 segments per quadrant):

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.
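
For example, a polygon contains, but does not properly contain, a smaller polygon that touches its boundary:

+
+>>> from shapely import Polygon
+>>> big = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
+>>> small = Polygon([(0, 0), (1, 0), (1, 1), (0, 1)])
+>>> big.contains(small)
+True
+>>> big.contains_properly(small)
+False
+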

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.
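
For example (dwithin requires Shapely 2.0 or later; the points below are 3 units apart):

+
+>>> from shapely import Point
+>>> Point(0, 0).dwithin(Point(3, 0), 4)
+True
+>>> Point(0, 0).dwithin(Point(3, 0), 2)
+False
+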

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.


This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for "equals()" to be True for two geometries and "equals_exact()" to be False.
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.
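
For example (output values shown as Python floats):

+
+>>> Polygon.from_bounds(0, 0, 2, 1).bounds
+(0.0, 0.0, 2.0, 1.0)
+>>> Polygon.from_bounds(0, 0, 2, 1).area
+2.0
+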

+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.
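
For example, assuming Shapely 2.x output formatting:

+
+>>> from shapely import LineString
+>>> LineString([(0, 0), (0, 10)]).interpolate(4)
+<POINT (0 4)>
+>>> LineString([(0, 0), (0, 10)]).interpolate(0.5, normalized=True)
+<POINT (0 5)>
+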

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.
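
For example, the nearest point on the line below to (5, 4) lies 4 units along it:

+
+>>> from shapely import LineString, Point
+>>> line = LineString([(0, 0), (0, 10)])
+>>> line.line_locate_point(Point(5, 4))
+4.0
+>>> line.line_locate_point(Point(5, 4), normalized=True)
+0.4
+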

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False
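
For example, the pattern below only requires that the interiors intersect in a point (a 0-dimensional intersection):

+
+>>> from shapely import LineString, Point
+>>> LineString([(0, 0), (2, 2)]).relate_pattern(Point(1, 1), "0********")
+True
+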

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a linestring or linear ring is counter-clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

+

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls

+
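
For example (this class appears to be the standard functools.partial re-exported here; the usage below is the same either way):

+
+>>> from functools import partial
+>>> int_base2 = partial(int, base=2)
+>>> int_base2("1010")
+10
+>>> int_base2.keywords
+{'base': 2}
+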
+ +
+ +
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

Creates a circular buffer polygon around a node.

+
+
Parameters:
+
    +
  • lat – latitude of the node

  • +
  • lon – longitude of the node

  • +
  • meters – buffer radius in meters

  • +
+
+
Returns:
+

Polygon

+
+
+
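
A minimal sketch of how such a buffer can be built, assuming pyproj >= 2.1 and shapely (an illustrative equivalent, not necessarily the implementation used here): project into a local azimuthal equidistant CRS centered on the node, buffer in meters, then project back to WGS 84.

+
+import pyproj
+from shapely.geometry import Point
+from shapely.ops import transform
+
+def geodesic_point_buffer(lat, lon, meters):
+    # Distances from the projection center are true in an azimuthal
+    # equidistant CRS, so a planar buffer of `meters` approximates a
+    # geodesic circle around the node.
+    aeqd = pyproj.CRS(f"+proj=aeqd +lat_0={lat} +lon_0={lon} +datum=WGS84 +units=m")
+    wgs84 = pyproj.CRS("EPSG:4326")
+    to_wgs84 = pyproj.Transformer.from_crs(aeqd, wgs84, always_xy=True).transform
+    return transform(to_wgs84, Point(0, 0).buffer(meters))
+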
+ +
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

+
+
Expected input/output:
+
input: -93.0965985, 44.952112199999995 with osm_node_id = 954734870

output: 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime.time object from a string in hh:mm:ss format.

+
+
Parameters:
+

hhmmss_str – string of hh:mm:ss

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

datetime.time

+
+
+
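
A minimal sketch of an equivalent conversion, using only the standard library (assumed for illustration; not necessarily the implementation used here):

+
+import datetime
+
+def hhmmss_to_datetime(hhmmss_str):
+    # e.g. "06:30:00" -> datetime.time(6, 30)
+    return datetime.datetime.strptime(hhmmss_str, "%H:%M:%S").time()
+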
+ +
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime.time object from seconds after midnight.

+
+
Parameters:
+

secs – seconds from midnight

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

datetime.time

+
+
+
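
A minimal sketch of an equivalent conversion, using only the standard library (assumed for illustration; not necessarily the implementation used here):

+
+import datetime
+
+def secs_to_datetime(secs):
+    # e.g. 23400 seconds after midnight -> datetime.time(6, 30)
+    return (datetime.datetime.min + datetime.timedelta(seconds=secs)).time()
+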
+ +
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input parameters may be iterable types like lists or arrays or single values. The output shall be of the same type: scalars in, scalars out; lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):
    return tuple(filter(None, [x, y, z]))

+
+
+

g2 = transform(id_func, g1)

+
+

Using pyproj >= 2.1, this example will accurately project Shapely geometries:

+
+

import pyproj

+

wgs84 = pyproj.CRS('EPSG:4326')
utm = pyproj.CRS('EPSG:32618')

+

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

+

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate a Unicode object into an ASCII string

+
+
Return type:
+

str

+
+
+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/functools/index.html b/branch/test_no_change/_modules/functools/index.html new file mode 100644 index 0000000..b3abe79 --- /dev/null +++ b/branch/test_no_change/_modules/functools/index.html @@ -0,0 +1,1083 @@ + + + + + + functools — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
[docs]class partial: + """New function with partial application of the given arguments + and keywords. + """ + + __slots__ = "func", "args", "keywords", "__dict__", "__weakref__" + + def __new__(cls, func, /, *args, **keywords): + if not callable(func): + raise TypeError("the first argument must be callable") + + if hasattr(func, "func"): + args = func.args + args + keywords = {**func.keywords, **keywords} + func = func.func + + self = super(partial, cls).__new__(cls) + + self.func = func + self.args = args + self.keywords = keywords + return self + + def __call__(self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(*self.args, *args, **keywords) + + @recursive_repr() + def __repr__(self): + qualname = type(self).__qualname__ + args = [repr(self.func)] + args.extend(repr(x) for x in self.args) + args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items()) + if type(self).__module__ == "functools": + return f"functools.{qualname}({', '.join(args)})" + return f"{qualname}({', '.join(args)})" + + def __reduce__(self): + return type(self), (self.func,), (self.func, self.args, + self.keywords or None, self.__dict__ or None) + + def __setstate__(self, state): + if not isinstance(state, tuple): + raise TypeError("argument to __setstate__ must be a tuple") + if len(state) != 4: + raise TypeError(f"expected 4 items in state, got {len(state)}") + func, args, kwds, namespace = state + if (not callable(func) or not isinstance(args, tuple) or + (kwds is not None and not isinstance(kwds, dict)) or + (namespace is not None and not isinstance(namespace, dict))): + raise TypeError("invalid partial state") + + args = tuple(args) # just in case it's a subclass + if kwds is None: + kwds = {} + elif type(kwds) is not dict: # XXX does it need to be *exactly* dict? + kwds = dict(kwds) + if namespace is None: + namespace = {} + + self.__dict__ = namespace + self.func = func + self.args = args + self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/index.html b/branch/test_no_change/_modules/index.html new file mode 100644 index 0000000..d50382b --- /dev/null +++ b/branch/test_no_change/_modules/index.html @@ -0,0 +1,116 @@ + + + + + + Overview: module code — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+ +
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/lasso/logger/index.html b/branch/test_no_change/_modules/lasso/logger/index.html new file mode 100644 index 0000000..53dd18d --- /dev/null +++ b/branch/test_no_change/_modules/lasso/logger/index.html @@ -0,0 +1,153 @@ + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
diff --git a/branch/test_no_change/_modules/lasso/parameters/index.html b/branch/test_no_change/_modules/lasso/parameters/index.html
new file mode 100644
index 0000000..8e396ec
--- /dev/null
+++ b/branch/test_no_change/_modules/lasso/parameters/index.html
@@ -0,0 +1,1062 @@
+lasso.parameters — lasso documentation

Source code for lasso.parameters

+import os
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise ValueError(msg)
+
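+# Illustration (paths are hypothetical): get_base_dir("/projects/lasso/tests")
+# checks that directory and up to two parent directories for a folder named
+# "metcouncil_data" and returns the first match (e.g. "/projects/lasso");
+# otherwise it raises ValueError asking for the `lasso_base_dir` keyword.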
+# should be a dataclass
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + self.taz_net_max_ft = 6 + self.maz_net_max_ft = 7 + # Potentially make a named tuple + self.taz_node_join_tolerance = (100, "US survey foot") + self.max_length_centroid_connector_when_none_in_taz = 999999999999999999999999999999999999 + + #TODO make this relative + self.taz_shape_file = r"C:\Users\USLP095001\code\MTC\travel-model-two\maz_taz\shapefiles\tazs_TM2_v2_2.shp" + self.maz_shape_file = r"C:\Users\USLP095001\code\MTC\travel-model-two\maz_taz\shapefiles\mazs_TM2_v2_2.shp" + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. 
+ """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. 
Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + + self.emme_drive_filter_criteria = [ + "" + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + 
self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + self.fare_2015_to_2000_deflator = 180.20/258.27 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 'subregion' + #### + + self.output_variables = [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + #"roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = 
os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("epsg:2875") + self.output_proj4 = '+proj=lcc +lat_0=32.1666666666667 +lon_0=-116.25 +lat_1=33.8833333333333 +lat_2=32.7833333333333 +x_0=2000000.0001016 +y_0=500000.0001016 +ellps=GRS80 +towgs84=-0.991,1.9072,0.5129,-1.25033e-07,-4.6785e-08,-5.6529e-08,0 +units=us-ft +no_defs +type=crs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '2875.prj') + self.wkt_projection = 'PROJCS["NAD83(HARN) / California zone 6 (ftUS)",GEOGCS["NAD83(HARN)",DATUM["NAD83_High_Accuracy_Reference_Network",SPHEROID["GRS 1980",6378137,298.257222101],TOWGS84[-0.991,1.9072,0.5129,-1.25033E-07,-4.6785E-08,-5.6529E-08,0]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4152"]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["latitude_of_origin",32.1666666666667],PARAMETER["central_meridian",-116.25],PARAMETER["standard_parallel_1",33.8833333333333],PARAMETER["standard_parallel_2",32.7833333333333],PARAMETER["false_easting",6561666.667],PARAMETER["false_northing",1640416.667],UNIT["US survey foot",0.304800609601219],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","2875"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 4756 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + "truck_access", + #MTC + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + # paramters added for PNR simulation + self.pnr_node_location = os.path.join( + self.data_file_location, "lookups", "pnr_stations.csv" + ) + self.pnr_buffer = 20 + self.knr_buffer = 2.5 + self.walk_buffer = 0.75 
+ self.transfer_buffer = 1 + self.taz_list = os.path.join( + self.data_file_location, "lookups", "taz_lists.csv" + ) + self.sf_county = os.path.join( + self.data_file_location, "lookups", "SFcounty.shp" + ) + + self.__dict__.update(kwargs)
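Because __init__ ends with self.__dict__.update(kwargs), any of the attributes set above can be overridden by keyword at construction time. A minimal sketch, assuming a lasso checkout at an illustrative path::

    from lasso.parameters import Parameters

    params = Parameters(lasso_base_dir="/projects/lasso")
    params_custom = Parameters(
        lasso_base_dir="/projects/lasso",
        centroid_connect_lanes=2,            # overrides the default of 1
        scratch_location="/tmp/lasso_scratch",
    )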
diff --git a/branch/test_no_change/_modules/lasso/project/index.html b/branch/test_no_change/_modules/lasso/project/index.html
new file mode 100644
index 0000000..65ad8b5
--- /dev/null
+++ b/branch/test_no_change/_modules/lasso/project/index.html
@@ -0,0 +1,1505 @@
+lasso.project — lasso documentation

Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_link_changes: dataframe of roadway changes read from a log file + roadway_node_changes: dataframe of roadway changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatibility( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
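+    # A usage sketch for the roadway side of create_project (file and
+    # directory names are illustrative, not shipped with lasso)::
+    #
+    #     project = Project.create_project(
+    #         roadway_log_file="roadway_changes.log",
+    #         base_roadway_dir="/path/to/base_roadway_network",
+    #         parameters={"lasso_base_dir": "/path/to/lasso"},
+    #     )
+    #     project.write_project_card("roadway_project_card.yml")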
+ +
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]): + """ + Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("N") + ] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("L") + ] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + def split_log(x): + return list(reader([x], delimiter=',', quotechar='"'))[0] + + _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df, _node_df]) + link_df = pd.concat([link_df, _link_df]) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + link_df.columns = [c.split("[")[0] for c in link_df.columns] + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + node_df.columns = [c.split("[")[0] for c in node_df.columns] + + if len(link_df) > 0: + # create operation history + action_history_df = ( + link_df.groupby(['A', 'B'])["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left") + + if len(node_df) > 0: + action_history_df = ( + node_df.groupby('N')["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + node_df = pd.merge(node_df, action_history_df, on='N', how="left") + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return link_df, node_df
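+    # Note: when any records are present, the returned link and node frames
+    # also carry an "operation_history" list column and a consolidated
+    # "operation_final" column (see _final_op below).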
+ +
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
+ +
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'] if n not in emme_node_id_crosswalk_df['emme_node_id'] + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
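+    # Element id formats assumed by the parsing above (inferred from the
+    # split("-") calls; the concrete ids are illustrative):
+    #   link:         "1234-5678"             -> emme A node 1234, B node 5678
+    #   transit stop: "<line>-1234-5678-..."  -> i_node and j_node are the
+    #                                            third- and second-to-last fields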
+ +
[docs] def get_object_from_network_build_command(row): + """ + determine the network build object is node or link + + Args: + row: network build command history dataframe + + Returns: + 'N' for node, 'L' for link + """ + + if row.get('command') == 'create_link': + return 'L' + + if row.get('command') == 'create_node': + return 'N' + + if row.get('command') == 'delete_link': + return 'L' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'L' + if row.get('parameters').get('element_type') == 'NODE': + return 'N' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'TRANSIT_LINE' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'TRANSIT_STOP' + + if row.get('command') == 'modify_transit_line': + return 'TRANSIT_SHAPE'
+ +
[docs] def get_operation_from_network_build_command(row): + """ + determine the network build object action type + + Args: + row: network build command history dataframe + + Returns: + 'A', 'C', 'D' + """ + + if row.get('command') == 'create_link': + return 'A' + + if row.get('command') == 'create_node': + return 'A' + + if row.get('command') == 'delete_link': + return 'D' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'C' + if row.get('parameters').get('element_type') == 'NODE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'C' + + if row.get('command') == 'modify_transit_line': + return 'C'
+ +
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
+ +
[docs] @staticmethod + def determine_roadway_network_changes_compatability( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
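The link compatibility check above amounts to a left merge of the changed A/B pairs against the base links, flagging rows where no model_link_id is found. A standalone sketch with toy values (the real method raises a ValueError where this prints):

import pandas as pd

base_links = pd.DataFrame(
    {"A": ["1", "2"], "B": ["2", "3"], "model_link_id": ["100", "101"]}
)
link_changes = pd.DataFrame({"A": ["1", "9"], "B": ["2", "9"]})  # (9, 9) is not in the base network

merged = pd.merge(
    link_changes[["A", "B"]].astype(str),
    base_links[["A", "B", "model_link_id"]].astype(str),
    how="left",
    on=["A", "B"],
)
missing = merged.loc[merged["model_link_id"].isna()]
if missing.shape[0]:
    # the real method logs this and raises ValueError
    print("Network missing the following AB links:\n{}".format(missing))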
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated and initializes + self.card_data as an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if (not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
+ +
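For reference, the dictionary assembled by evaluate_changes() has a flat shape that downstream project-card writers consume. A hedged sketch with made-up entries (the 'Roadway Deletion' structure mirrors the _process_deletions helper further below; the project name is hypothetical):

transit_change_list = []  # would come from add_transit_changes()
highway_change_list = [
    # illustrative entry in the shape produced by _process_deletions
    {"category": "Roadway Deletion", "links": {"model_link_id": [100, 101]}},
]
card_data = {
    "project": "Example Project",
    "changes": transit_change_list + highway_change_list,
}
print(card_data["changes"])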
[docs] def add_transit_changes(self): + """ + Evaluates changes between base and build transit objects and + adds entries into the self.card_data dictionary. + """ + # default to an empty list so the return value is defined when neither comparison applies + transit_change_list = [] + if self.build_cube_transit_network: + transit_change_list = self.build_cube_transit_network.evaluate_differences( + self.base_cube_transit_network + ) + elif self.base_transit_network: + transit_change_list = self.base_transit_network.evaluate_differences( + self.transit_changes + ) + return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
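_final_op collapses a link's full log history into one net operation: added-then-deleted cancels out, deleted-then-re-added becomes a change, and so on. A standalone restatement of the same rules that can be run on toy histories (not the class method itself):

def final_op(operation_history):
    """Collapse a list of log operations (A/C/D) into one net operation."""
    last, earlier = operation_history[-1], operation_history[:-1]
    if last == "D":
        return "N" if "A" in earlier else "D"   # added then deleted -> no-op
    if last == "A":
        return "C" if "D" in earlier else "A"   # deleted then re-added -> change
    return "A" if "A" in earlier else "C"       # changes after an add stay an add

for history in (["A", "C", "D"], ["D", "A"], ["A", "C"], ["C", "C"]):
    print(history, "->", final_op(history))
# ['A', 'C', 'D'] -> N, ['D', 'A'] -> C, ['A', 'C'] -> A, ['C', 'C'] -> C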
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
+ WranglerLogger.warning(msg) + #raise ValueError(msg) + node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"] + + if add_link_dict: + add_link_dict["nodes"] = _process_node_additions(node_add_df) + else: + add_link_dict = {"category": "Add New Roadway", "nodes": _process_node_additions(node_add_df)} + + else: + None + + # combine together + + highway_change_list = list( + filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list) + ) + + return highway_change_list
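The link-change processing above finishes by serializing each property-change list to a string, grouping identical changes, and reshaping the grouped model_link_ids into a 'facility' dictionary. A small sketch of that groupby-and-aggregate step with made-up properties:

import pandas as pd

# Hypothetical per-link change records: properties serialized to str so they hash.
change_df = pd.DataFrame(
    {
        "properties": [
            str([{"property": "lanes", "existing": 2, "set": 3}]),
            str([{"property": "lanes", "existing": 2, "set": 3}]),
            str([{"property": "ft", "existing": 4, "set": 5}]),
        ],
        "model_link_id": [100, 101, 102],
    }
)

# Group links that share the exact same property change.
grouped = (
    change_df.groupby("properties")[["model_link_id"]]
    .agg(lambda x: list(x))
    .reset_index()
)
# Reformat the grouped ids into the project-card "facility" structure.
grouped["facility"] = grouped.apply(
    lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1
)
print(grouped[["facility"]].to_dict("records"))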
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/lasso/roadway/index.html b/branch/test_no_change/_modules/lasso/roadway/index.html new file mode 100644 index 0000000..be27108 --- /dev/null +++ b/branch/test_no_change/_modules/lasso/roadway/index.html @@ -0,0 +1,2036 @@ + + + + + + lasso.roadway — lasso documentation + + + + + + + + + + + + + + + + + + +

Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
+ +
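A typical call to the reader above might look like the following; the file paths are assumptions and the example presumes ModelRoadwayNetwork is exported at the package level, so treat it as a sketch rather than a verified invocation.

from lasso import ModelRoadwayNetwork  # assumes a package-level export

model_net = ModelRoadwayNetwork.read(
    link_filename="examples/stpaul/link.json",       # assumed path
    node_filename="examples/stpaul/node.geojson",    # assumed path
    shape_filename="examples/stpaul/shape.geojson",  # assumed path
    fast=True,       # skip schema validation for speed
    parameters={},   # use default Parameters
)
print(len(model_net.links_df), "links;", len(model_net.nodes_df), "nodes")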
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
+ +
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None): + """ + Splits properties by time period, assuming a variable structure of + + Args: + properties_to_split: dict + dictionary of output variable prefix mapped to the source variable and what to stratify it by + e.g. + { + 'lanes' : {'v':'lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'ML_lanes' : {'v':'ML_lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'use' : {'v':'use', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + } + + """ + import itertools + + if properties_to_split == None: + properties_to_split = self.parameters.properties_to_split + + for out_var, params in properties_to_split.items(): + if params["v"] not in self.links_df.columns: + WranglerLogger.warning( + "Specified variable to split: {} not in network variables: {}. Returning 0.".format( + params["v"], str(self.links_df.columns) + ) + ) + if params.get("time_periods") and params.get("categories"): + + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + time_suffix + "_" + category_suffix + ] = 0 + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[out_var + "_" + time_suffix] = 0 + elif params.get("time_periods") and params.get("categories"): + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + category_suffix + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=params["categories"][category_suffix], + time_period=params["time_periods"][time_suffix], + ) + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[ + out_var + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=None, + time_period=params["time_periods"][time_suffix], + ) + else: + raise ValueError( + "Shoudn't have a category without a time period: {}".format(params) + )
+ +
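The effect of the split is one column per time period (and optionally per category), e.g. lanes_AM and lanes_PM. A standalone pandas sketch of the simple time-period case, with made-up values and without the category logic:

import pandas as pd

links_df = pd.DataFrame({"model_link_id": [1, 2], "lanes": [2, 3]})

# Hypothetical split definition, mirroring the properties_to_split structure.
properties_to_split = {
    "lanes": {
        "v": "lanes",
        "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
    }
}

for out_var, params in properties_to_split.items():
    for time_suffix in params["time_periods"]:
        # In this toy example the value does not vary by time of day,
        # so each period column just copies the base value.
        links_df[out_var + "_" + time_suffix] = links_df[params["v"]]

print(links_df.columns.tolist())  # ['model_link_id', 'lanes', 'lanes_AM', 'lanes_PM']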
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
[docs] @staticmethod + def read_match_result(path): + """ + Reads the shst geojson match returns. + + Returns shst dataframe. + + Reads many files of the same type and concatenates them into a single DataFrame. + + Args: + path (str): File path to SHST match results. + + Returns: + geodataframe: geopandas geodataframe + + ##todo + not sure why we need, but should be in utilities not this class + """ + refId_gdf = DataFrame() + refid_file = glob.glob(path) + for i in refid_file: + new = gpd.read_file(i) + refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False) + return refId_gdf
+ +
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # , # targetted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (str): File path to source data SHST match result. + source_gdf (str): File path to source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + None + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_ink_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
+ +
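The attribute transfer above is a pair of merges followed by keeping the best-scoring match per model_link_id. A compact sketch of that 'best score wins' de-duplication on toy data (all values hypothetical):

import pandas as pd

# Hypothetical matches: link 10 matched two source records with different scores.
matches = pd.DataFrame(
    {
        "model_link_id": [10, 10, 11],
        "source_link_id": ["a", "b", "c"],
        "source_score": [0.4, 0.9, 0.7],
        "route_sys": ["02", "05", "07"],  # attribute being transferred
    }
)

# Sort so the best score per link comes last, then keep only that row.
matches = matches.sort_values(
    by=["model_link_id", "source_score"], ascending=True, na_position="first"
)
best = matches.drop_duplicates(subset=["model_link_id"], keep="last")
print(best[["model_link_id", "route_sys"]].to_dict("records"))
# link 10 keeps the 0.9-score match ('05')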
[docs] def calculate_use( + self, + network_variable="use", + as_integer=True, + overwrite=False, + ): + """ + Calculates use variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "use" + as_integer (bool): If True, will convert true/false to 1/0s. Defauly to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "'use' Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + #MTC + self.links_df[network_variable] = int(1) + #/MTC + + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + #/MC + + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs] def create_ML_variable( + self, + network_variable="ML_lanes", + overwrite=False, + ): + """ + Created ML lanes placeholder for project to write out ML changes + + ML lanes default to 0, ML info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing ML Variable '{}' already in network".format( + network_variable + ) + ) + self.links_df[network_variable] = int(0) + else: + WranglerLogger.info( + "ML Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + WranglerLogger.info( + "Finished creating ML lanes variable: {}".format(network_variable) + )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + Returns: + RoadwayNetwork + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
+ + +
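A centroid connector is simply a link whose A or B node falls at or below the highest TAZ number. A toy illustration of the flag (the highest_taz_number value is made up; it normally comes from Parameters):

import pandas as pd

highest_taz_number = 3100  # hypothetical
links_df = pd.DataFrame({"A": [150, 4000], "B": [4001, 4002]})

# Flag links that touch a TAZ node, then cast the boolean to 0/1.
links_df["centroidconnect"] = (
    (links_df["A"] <= highest_taz_number) | (links_df["B"] <= highest_taz_number)
).astype(int)
print(links_df)  # the first link is a centroid connector (A=150 is a TAZ)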
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
+ +
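Distance is computed by projecting the link geometry to a planar CRS (EPSG:26915, meters) and converting the length to miles. A minimal geopandas sketch on one made-up segment:

import geopandas as gpd
from shapely.geometry import LineString

# One hypothetical link, roughly 0.01 degrees long, in WGS84.
links_gdf = gpd.GeoDataFrame(
    {"model_link_id": [1]},
    geometry=[LineString([(-93.10, 44.95), (-93.09, 44.95)])],
    crs="EPSG:4326",
)

projected = links_gdf.to_crs(epsg=26915)                      # UTM zone 15N, meters
links_gdf["distance"] = projected.geometry.length / 1609.34   # meters -> miles
print(links_gdf["distance"].round(3).tolist())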
[docs] def convert_int(self, int_col_names=[]): + """ + Convert integer columns + """ + + #MTC + WranglerLogger.info( + "Converting variable type to mtc standard" + ) + + int_col_names = self.parameters.int_col + #/MTC + #MC + """ + WranglerLogger.info("Converting variable type to MetCouncil standard") + + if not int_col_names: + int_col_names = self.parameters.int_col + #/MC + """ + ##Why are we doing this? + # int_col_names.remove("lanes") + + for c in list(set(self.links_df.columns) & set(int_col_names)): + try: + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + self.links_df[c] = self.links_df[c].replace("", 0) + self.links_df[c] = self.links_df[c].astype(int) + except ValueError: + try: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + except: + msg = f"Could not convert column {c} to integer." + WranglerLogger.error(msg) + raise ValueError(msg) + + for c in list(set(self.nodes_df.columns) & set(int_col_names)): + self.nodes_df[c] = self.nodes_df[c].replace("", 0) + self.nodes_df[c] = self.nodes_df[c].astype(int)
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
+ +
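The DBF renaming is a crosswalk-driven column rename that keeps field names within the 10-character shapefile limit. A small sketch of the same lookup; the crosswalk contents here are hypothetical:

import pandas as pd

# Hypothetical net -> dbf crosswalk; DBF field names must be <= 10 characters.
crosswalk_df = pd.DataFrame(
    {"net": ["model_link_id", "assign_group"], "dbf": ["link_id", "asgngrp"]}
)
net_to_dbf = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"]))

links_df = pd.DataFrame({"model_link_id": [1], "assign_group": [7], "lanes": [2]})
output_variables = ["model_link_id", "assign_group", "lanes"]

dbf_cols = []
dbf_df = links_df.copy()
for c in links_df.columns:
    if c in output_variables:
        # rename via the crosswalk when a short name exists, otherwise keep as-is
        dbf_df = dbf_df.rename(columns={c: net_to_dbf.get(c, c)})
        dbf_cols.append(net_to_dbf.get(c, c))
print(dbf_df[dbf_cols].columns.tolist())  # ['link_id', 'asgngrp', 'lanes']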
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "heuristic_num" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2" + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
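# Editor's note: a usage sketch (not from the source); directory, file, and layer names
# below are hypothetical. Only the GeoPackage outputs are requested here, so the
# 10-character DBF renaming branch is not triggered, and data_to_csv=False skips the
# CSV outputs (which would otherwise need output_link_csv / output_node_csv).
model_net.write_roadway_as_shp(
    output_dir="output",
    output_gpkg="model_net.gpkg",
    output_link_gpkg_layer="links",
    output_node_gpkg_layer="nodes",
    output_gpkg_link_filter="county",          # also writes one link layer per county value
    data_to_csv=False,
)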
+ + + # this should be moved to util +
[docs]    @staticmethod
    def dataframe_to_fixed_width(df):
        """
        Convert a dataframe to fixed-width format; the geometry column is not transformed.

        Args:
            df (pandas DataFrame): dataframe to convert.

        Returns:
            pandas dataframe: dataframe with fixed-width string values for each column.
            dict: dictionary with column names as keys and column widths as values.
        """
        WranglerLogger.info("Starting fixed width conversion")

        # get the max string length for each variable column
        max_width_dict = dict(
            [
                (v, df[v].apply(lambda r: len(str(r)) if r is not None else 0).max())
                for v in df.columns.values
                if v != "geometry"
            ]
        )

        fw_df = df.drop("geometry", axis=1).copy()
        for c in fw_df.columns:
            # left-pad each value with spaces so the column is exactly max_width_dict[c] wide
            fw_df[c] = fw_df[c].apply(lambda x: str(x).rjust(max_width_dict[c]))

        return fw_df, max_width_dict
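# Editor's note: a small sketch (not from the source) of what dataframe_to_fixed_width()
# returns for a toy frame; `net` is assumed to be an instance of the enclosing network
# class (the method is a staticmethod, so the class itself would also work).
import pandas as pd
from shapely.geometry import Point  # assumes shapely, which geopandas already requires

toy_df = pd.DataFrame(
    {"A": [1, 12345], "name": ["Main St", "I-80"], "geometry": [Point(0, 0), Point(1, 1)]}
)
fw_df, width_dict = net.dataframe_to_fixed_width(toy_df)
# width_dict -> {"A": 5, "name": 7}
# fw_df["A"].tolist() -> ["    1", "12345"]   (values left-padded to the column width)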
+ +
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
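# Editor's note: a usage sketch (not from the source); file names below are hypothetical
# and fall back to Parameters defaults when omitted. Note that the final step shells out
# to Cube Voyager at a hard-coded Windows path, so the network build only succeeds on a
# machine with Citilabs Cube installed there; the text files are written regardless.
model_net.write_roadway_as_fixedwidth(
    output_dir="output",
    output_link_txt="links.txt",
    output_node_txt="nodes.txt",
    output_cube_network_script="build_net.s",
    drive_only=True,                           # drop non-drive links and nodes
)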
+
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/lasso/transit/index.html b/branch/test_no_change/_modules/lasso/transit/index.html new file mode 100644 index 0000000..e075d49 --- /dev/null +++ b/branch/test_no_change/_modules/lasso/transit/index.html @@ -0,0 +1,2037 @@ + + + + + + lasso.transit — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs]    def __init__(self, parameters: Union[Parameters, dict] = {}):
        """
        Constructor for CubeTransit.

        Args:
            parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters.
        """
        WranglerLogger.debug("Creating a new Cube Transit instance")

        self.lines = []

        self.line_properties = {}
        self.shapes = {}

        self.program_type = None

        if type(parameters) is dict:
            self.parameters = Parameters(**parameters)
        elif isinstance(parameters, Parameters):
            self.parameters = Parameters(**parameters.__dict__)
        else:
            msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format(
                parameters, type(parameters)
            )
            WranglerLogger.error(msg)
            raise ValueError(msg)

        self.source_list = []

        # start with an empty diff dictionary (not the bare Dict[str, Any] type alias)
        self.diff_dict: Dict[str, Any] = {}
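# Editor's note: a minimal construction sketch (not from the source). Both forms are
# accepted by the constructor above; the keyword and path shown are illustrative only.
tn_from_dict = CubeTransit(parameters={"lasso_base_dir": "/path/to/lasso"})
tn_from_params = CubeTransit(parameters=Parameters(lasso_base_dir="/path/to/lasso"))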
+ +
[docs]    def add_cube(self, transit_source: str):
        """Reads a cube line file (or line string, or directory of .LIN files) and adds it to this CubeTransit instance.

        Args:
            transit_source: a cube line file string, a path to a .lin file, or a directory containing .LIN files

        """

        """
        Figure out what kind of transit source it is
        """

        parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr")

        if "NAME=" in transit_source:
            WranglerLogger.debug("reading transit source as string")
            self.source_list.append("input_str")
            parse_tree = parser.parse(transit_source)
        elif os.path.isfile(transit_source):
            print("reading: {}".format(transit_source))
            with open(transit_source) as file:
                WranglerLogger.debug(
                    "reading transit source: {}".format(transit_source)
                )
                self.source_list.append(transit_source)
                parse_tree = parser.parse(file.read())
        elif os.path.isdir(transit_source):
            import glob

            for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")):
                self.add_cube(lin_file)
            return
        else:
            msg = "{} not a valid transit line string, directory, or file".format(
                transit_source
            )
            WranglerLogger.error(msg)
            raise ValueError(msg)

        WranglerLogger.debug("finished parsing cube line file")
        # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty()))
        transformed_tree_data = CubeTransformer().transform(parse_tree)
        # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data))

        _line_data = transformed_tree_data["lines"]

        line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()}
        line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()}
        new_lines = list(line_properties_dict.keys())

        """
        Before adding lines, check to see if any are overlapping with existing ones in the network
        """

        overlapping_lines = set(new_lines) & set(self.lines)
        if overlapping_lines:
            msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format(
                transit_source,
                "\n - ".join(self.source_list),
                len(overlapping_lines),
                len(new_lines),
                overlapping_lines,
            )
            print(msg)
            WranglerLogger.error(msg)
            raise ValueError(msg)

        self.program_type = transformed_tree_data.get("program_type", None)

        self.lines += new_lines
        self.line_properties.update(line_properties_dict)
        self.shapes.update(line_shapes_dict)

        WranglerLogger.debug("Added lines to CubeTransit: \n{}".format(new_lines))
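# Editor's note: a usage sketch (not from the source). add_cube() treats anything
# containing "NAME=" as an in-memory line string; otherwise it expects a .lin file or a
# directory of .LIN files. The line string and paths below are illustrative only, and
# whether a given string parses depends on TRANSIT_LINE_FILE_GRAMMAR.
tn = CubeTransit()
tn.add_cube('LINE NAME="0_452-111_452_pk1", HEADWAY[1]=10, NODES=1,-2,3')  # string input
tn.add_cube("cube_transit/route_452.lin")   # single line file
tn.add_cube("cube_transit/")                # every *.LIN file in the directory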
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
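# Editor's note: an end-to-end sketch (not from the source) of consuming the change list
# returned above. BASE_CUBE_DIR and BUILD_CUBE_DIR are hypothetical directories of .LIN
# files; how the change dictionaries are written out as project cards is up to the caller.
base_tn = CubeTransit.create_from_cube(BASE_CUBE_DIR)
build_tn = CubeTransit.create_from_cube(BUILD_CUBE_DIR)
changes = build_tn.evaluate_differences(base_tn)   # list of project-card change dicts
for i, change in enumerate(changes):
    print(i, change["category"], change.get("facility", {}).get("route_id"))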
+ +
[docs]    def add_additional_time_periods(
        self, new_time_period_number: int, orig_line_name: str
    ):
        """
        Copies a route to another cube time period with appropriate
        values for time-period-specific properties.

        New properties are stored under the new name in:
        - ::self.shapes
        - ::self.line_properties

        Args:
            new_time_period_number (int): cube time period number
            orig_line_name(str): name of the originating line, from which
                the new line will copy its properties.

        Returns:
            Line name with new time period.
        """
        WranglerLogger.debug(
            "adding time periods {} to line {}".format(
                new_time_period_number, orig_line_name
            )
        )

        (
            route_id,
            _init_time_period,
            agency_id,
            direction_id,
        ) = CubeTransit.unpack_route_name(orig_line_name)
        new_time_period_name = self.parameters.cube_time_periods[new_time_period_number]
        new_tp_line_name = CubeTransit.build_route_name(
            route_id=route_id,
            time_period=new_time_period_name,
            agency_id=agency_id,
            direction_id=direction_id,
        )

        try:
            assert new_tp_line_name not in self.lines
        except:
            msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format(
                new_time_period_number, orig_line_name, new_tp_line_name
            )
            WranglerLogger.error(msg)
            raise ValueError(msg)

        # copy to a new line and add it to list of lines to add
        self.line_properties[new_tp_line_name] = copy.deepcopy(
            self.line_properties[orig_line_name]
        )
        self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name])
        self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name

        """
        Remove entries that aren't for this time period from the new line's properties list.
        """
        this_time_period_properties_list = [
            p + "[" + str(new_time_period_number) + "]"
            ##todo parameterize all time period specific variables
            for p in ["HEADWAY", "FREQ"]
        ]

        not_this_tp_properties_list = list(
            set(self.parameters.time_period_properties_list)
            - set(this_time_period_properties_list)
        )

        for k in not_this_tp_properties_list:
            self.line_properties[new_tp_line_name].pop(k, None)

        """
        Remove entries for this time period from the original line's properties list.
        """
        for k in this_time_period_properties_list:
            self.line_properties[orig_line_name].pop(k, None)

        """
        Add new line to list of lines to add.
        """
        WranglerLogger.debug(
            "Adding new time period {} for line {} as {}.".format(
                new_time_period_number, orig_line_name, new_tp_line_name
            )
        )
        return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs]    def create_add_route_card_dict(self, line: str):
        """
        Creates a project card change formatted dictionary for adding
        a route based on the information in self.route_properties for
        the line.

        Args:
            line: name of line that is being added

        Returns:
            A project card change-formatted dictionary for the route addition.
        """
        start_time_str, end_time_str = self.calculate_start_end_times(
            self.line_properties[line]
        )

        standard_properties = self.cube_properties_to_standard_properties(
            self.line_properties[line]
        )

        routing_properties = {
            "property": "routing",
            "set": self.shapes[line]["node"].tolist(),
        }

        add_card_dict = {
            "category": "New Transit Service",
            "facility": {
                "route_id": line.split("_")[1],
                # direction is the trailing digit of the line name; agency is the leading token
                "direction_id": int(line.strip('"')[-1]),
                "start_time": start_time_str,
                "end_time": end_time_str,
                "agency_id": line.split("_")[0],
            },
            "properties": standard_properties + [routing_properties],
        }

        WranglerLogger.debug(
            "Adding {} route to changes:\n{}".format(line, add_card_dict)
        )
        return add_card_dict
+ +
[docs]    @staticmethod
    def get_time_period_numbers_from_cube_properties(properties_list: list):
        """
        Finds properties that are associated with time periods and
        returns the time period numbers found in them.

        Args:
            properties_list (list): list of all properties.

        Returns:
            list of strings of the time period numbers found
        """
        time_periods_list = []
        for p in properties_list:
            if ("[" not in p) or ("]" not in p):
                continue
            tp_num = p.split("[")[1][0]
            if tp_num and tp_num not in time_periods_list:
                time_periods_list.append(tp_num)
        return time_periods_list
+ +
[docs]    @staticmethod
    def build_route_name(
        route_id: str = "",
        time_period: str = "",
        agency_id: str = 0,
        direction_id: str = 1,
    ):
        """
        Create a route name by concatenating route, time period, agency, and direction.

        Args:
            route_id: i.e. 452-111
            time_period: i.e. pk
            direction_id: i.e. 1
            agency_id: i.e. 0

        Returns:
            constructed line_name i.e. "0_452-111_452_pk1"
        """

        return (
            str(agency_id)
            + "_"
            + str(route_id)
            + "_"
            + str(route_id.split("-")[0])
            + "_"
            + str(time_period)
            + str(direction_id)
        )
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
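# Editor's note: a round-trip sketch (not from the source) for the two static helpers
# above, using the naming convention documented in build_route_name().
name = CubeTransit.build_route_name(
    route_id="452-111", time_period="pk", agency_id=0, direction_id=1
)
# name == "0_452-111_452_pk1"
route_id, time_period, agency_id, direction_id = CubeTransit.unpack_route_name(name)
# ("452-111", "pk", "0", "1")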
+ +
[docs]    def calculate_start_end_times(self, line_properties_dict: dict):
        """
        Calculate the start and end times of the property change.
        WARNING: Doesn't take care of non-contiguous time periods!!!!

        Args:
            line_properties_dict: dictionary of cube-flavor properties for a transit line

        Returns:
            start and end times as "HH:MM" strings
        """
        start_time_m = 24 * 60
        end_time_m = 0 * 60

        WranglerLogger.debug(
            "parameters.time_period_properties_list: {}".format(
                self.parameters.time_period_properties_list
            )
        )
        current_cube_time_period_numbers = (
            CubeTransit.get_time_period_numbers_from_cube_properties(
                line_properties_dict
            )
        )

        WranglerLogger.debug(
            "current_cube_time_period_numbers:{}".format(
                current_cube_time_period_numbers
            )
        )

        for tp in current_cube_time_period_numbers:
            time_period_name = self.parameters.cube_time_periods[tp]
            WranglerLogger.debug("time_period_name:{}".format(time_period_name))
            _start_time, _end_time = self.parameters.time_period_to_time[
                time_period_name
            ]

            # change from "HH:MM" to integer minutes since midnight
            _start_time_m = (int(_start_time.split(":")[0]) * 60) + int(
                _start_time.split(":")[1]
            )
            _end_time_m = (int(_end_time.split(":")[0]) * 60) + int(
                _end_time.split(":")[1]
            )

            # find bounding start and end times
            if _start_time_m < start_time_m:
                start_time_m = _start_time_m
            if _end_time_m > end_time_m:
                end_time_m = _end_time_m

        if start_time_m > end_time_m:
            msg = "Start time ({}) is after end time ({})".format(
                start_time_m, end_time_m
            )
            #WranglerLogger.error(msg)
            #raise ValueError(msg)

        start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60))
        end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60))
        return start_time_str, end_time_str
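# Editor's note: an illustrative sketch (not from the source). Assuming the instance's
# parameters map cube time period "1" to ("06:00", "09:00") and "2" to ("09:00", "15:30"),
# a line with headways in both periods gets the bounding window of the two.
props = {"HEADWAY[1]": 10, "HEADWAY[2]": 20}
start, end = tn.calculate_start_end_times(props)
# start == "06:00", end == "15:30"   (under the assumed time-period definitions)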
+ +
[docs]    @staticmethod
    def cube_properties_to_standard_properties(cube_properties_dict: dict):
        """
        Converts cube-style properties to standard properties.

        This is most pertinent to time-period-specific variables, and to
        variables that have standard units, like headway, which is minutes
        in cube and seconds in standard format.

        Args:
            cube_properties_dict: <cube style property name> : <property value>

        Returns:
            A list of dictionaries with values for `"property": <standard
            style property name>, "set" : <property value with correct units>`

        """
        standard_properties_list = []
        for k, v in cube_properties_dict.items():
            change_item = {}
            if any(i in k for i in ["HEADWAY", "FREQ"]):
                change_item["property"] = "headway_secs"
                change_item["set"] = v * 60
            else:
                change_item["property"] = k
                change_item["set"] = v
            standard_properties_list.append(change_item)

        return standard_properties_list
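# Editor's note: a small sketch (not from the source) of the unit conversion above:
# cube headways are in minutes, project-card headway_secs are in seconds.
cube_props = {"HEADWAY[1]": 10, "MODE": 5, "ONEWAY": "T"}
CubeTransit.cube_properties_to_standard_properties(cube_props)
# [{"property": "headway_secs", "set": 600},
#  {"property": "MODE", "set": 5},
#  {"property": "ONEWAY", "set": "T"}]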
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs returns list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: dDataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
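# Editor's note: a toy sketch (not from the source) of the shape diff above. The helper
# is given pandas Series of node numbers (the `node` column of each shape) and returns a
# single routing change with roughly two nodes of context on either side of the edit.
import pandas as pd

base_nodes = pd.Series([1, 2, 3, 4, 5, 6, 7])
build_nodes = pd.Series([1, 2, 3, 8, 9, 5, 6, 7])
CubeTransit.evaluate_route_shape_changes(shape_build=build_nodes, shape_base=base_nodes)
# returns something like:
# [{"property": "routing", "existing": [2, 3, 4, 5], "set": [2, 3, 8, 9, 5]}]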
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
+ +
[docs] def shape_gtfs_to_dict_list(self, trip_id: str, shape_id: str, add_nntime: bool): + """ + This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of + stepping through the routed nodes and corresponding them with shape nodes. + + TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when + the transit routing on the roadway network is first performed. + + As such, I'm copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications. + + Args: + trip_id of the trip in question + shape_id of the trip in question + Returns: + list of dict records with columns: + trip_id + shape_id + shape_pt_sequence + shape_mode_node_id + is_stop + access + stop_sequence + """ + # get the stop times for this route + # https://developers.google.com/transit/gtfs/reference#stop_timestxt + trip_stop_times_df = self.feed.stop_times.loc[ self.feed.stop_times.trip_id == trip_id, + ['trip_id','arrival_time','departure_time','stop_id','stop_sequence','pickup_type','drop_off_type']].copy() + trip_stop_times_df.sort_values(by='stop_sequence', inplace=True) + trip_stop_times_df.reset_index(drop=True, inplace=True) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df: + # trip_id arrival_time departure_time stop_id stop_sequence pickup_type drop_off_type + # 0 10007 0 0 7781 1 0 NaN + # 1 10007 120 120 7845 2 0 NaN + # 2 10007 300 300 7790 3 0 NaN + # 3 10007 360 360 7854 4 0 NaN + # 4 10007 390 390 7951 5 0 NaN + # 5 10007 720 720 7950 6 0 NaN + # 6 10007 810 810 7850 7 0 NaN + # 7 10007 855 855 7945 8 0 NaN + # 8 10007 900 900 7803 9 0 NaN + # 9 10007 930 930 7941 10 0 NaN + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + + # get the shapes for this route + # https://developers.google.com/transit/gtfs/reference#shapestxt + trip_node_df = self.feed.shapes.loc[self.feed.shapes.shape_id == shape_id].copy() + trip_node_df.sort_values(by="shape_pt_sequence", inplace = True) + trip_node_df.reset_index(drop=True, inplace=True) + # print("trip_node_df.head(20):\n{}".format(trip_node_df.head(20))) + # print("trip_node_df.dtypes:\n{}".format(trip_node_df.dtypes)) + # trip_node_df: + # shape_id shape_pt_sequence shape_osm_node_id shape_shst_node_id shape_model_node_id shape_pt_lat shape_pt_lon + # 0 696 1 1429334016 35cb440c505534e8aedbd3a286b70eab 2139625 NaN NaN + # 1 696 2 444242480 39e263722d5849b3c732b48734671400 2164862 NaN NaN + # 2 696 3 5686705779 4c41c608c35f457079fd673bce5556e5 2169898 NaN NaN + # 3 696 4 3695761874 d0f5b2173189bbb1b5dbaa78a004e8c4 2021876 NaN NaN + # 4 696 5 1433982749 60726971f0fb359a57e9d8df30bf384b 2002078 NaN NaN + # 5 696 6 1433982740 634c301424647d5883191edf522180e3 2156807 NaN NaN + # 6 696 7 4915736746 f03c3d7f1aa0358a91c165f53dac1e20 2145185 NaN NaN + # 7 696 8 65604864 68b8df24f1572d267ecf834107741393 2120788 NaN NaN + # 8 696 9 65604866 e412a013ad45af6649fa1b396f74c127 2066513 NaN NaN + # 9 696 10 956664242 657e1602aa8585383ed058f28f7811ed 2006476 NaN NaN + # 10 696 11 291642561 726b03cced023a6459d7333885927208 2133933 NaN NaN + # 11 696 12 291642583 709a0c00811f213f7476349a2c002003 2159991 NaN NaN + # 12 696 13 291642745 c5aaab62e0c78c34d93ee57795f06953 2165343 NaN NaN + # 13 696 14 5718664845 c7f1f4aa88887071a0d28154fc84604b 2007965 NaN NaN + # 14 696 
15 291642692 0ef007a79b391e8ba98daf4985f26f9b 2160569 NaN NaN + # 15 696 16 5718664843 2ce63288e77747abc3a4124f0e28efcf 2047955 NaN NaN + # 16 696 17 3485537279 ec0c8eb524f41072a9fd87ecfd45e15f 2169094 NaN NaN + # 17 696 18 5718664419 57ca23828db4adea39355a92fb0fc3ff 2082102 NaN NaN + # 18 696 19 5718664417 4aba41268ada1058ee58e99a84e28d37 2019974 NaN NaN + # 19 696 20 65545418 d4f815a2f6da6c95d2f032a3cd61020c 2025374 NaN NaN # trip_node_df.dtypes: + # shape_id object + # shape_pt_sequence int64 + # shape_osm_node_id object + # shape_shst_node_id object + # shape_model_node_id object + # shape_pt_lat object + # shape_pt_lon object + + # we only need: shape_id, shape_pt_sequence, shape_model_node_id + trip_node_df = trip_node_df[['shape_id','shape_pt_sequence','shape_model_node_id']] + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + # stop_name object + # stop_lat float64 + # stop_lon float64 + # zone_id object + # agency_raw_name object + # stop_code object + # location_type float64 + # parent_station object + # stop_desc object + # stop_url object + # stop_timezone object + # wheelchair_boarding float64 + # platform_code object + # position object + # direction object + # * used by routes object + # osm_node_id object + # shst_node_id object + # model_node_id object + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # this is the same as shape_gtfs_to_cube but we'll build up a list of dicts with shape/stop information + shape_stop_dict_list = [] + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id' ] = trip_id + node_dict['is_stop' ] = True + node_dict['access' ] = access_v + node_dict['stop_sequence'] = stop_seq + shape_stop_dict_list.append(node_dict) + + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id'] = trip_id + node_dict['is_stop'] = False + shape_stop_dict_list.append(node_dict) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + # print("node_list_str: {}".format(node_list_str)) + return shape_stop_dict_list
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str
+ + +
[docs]    def cube_format(self, row):
        """
        Creates a string representing the route in cube line file notation.
        #MC
        Args:
            row: row of a DataFrame representing a cube-formatted trip, with the Attributes
                trip_id, shape_id, NAME, LONGNAME, tod_num, HEADWAY, MODE, ONEWAY, OPERATOR

        Returns:
            string representation of route in cube line file notation
        """

        s = '\nLINE NAME="{}",'.format(row.NAME)
        s += '\n LONGNAME="{}",'.format(row.LONGNAME)
        s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY)
        s += "\n MODE={},".format(row.MODE)
        s += "\n ONEWAY={},".format(row.ONEWAY)
        s += "\n OPERATOR={},".format(row.OPERATOR)
        s += "\n NODES={}".format(self.shape_gtfs_to_cube(row))

        return s
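# Editor's note: an illustrative output sketch (not from the source). For a cube-formatted
# trip row (values below are hypothetical), cube_format() yields a block shaped like:
#
#   LINE NAME="0_452-111_452_pk1",
#    LONGNAME="Express 452",
#    HEADWAY[1]=10,
#    MODE=7,
#    ONEWAY=T,
#    OPERATOR=3,
#    NODES=
#        ... node list from shape_gtfs_to_cube(row); stops positive, non-stops negative ...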
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
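# Note on the result: the DataFrame returned above is the trip's shape table
# augmented with the Emme-oriented columns added in this method:
# time_minutes, allow_alightings, allow_boardings, stop_name, line_id,
# node_id and stop_order. Shape points that are not stops get
# time_minutes=0, allow_alightings=0, allow_boardings=0 and an empty stop_name.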
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + # moved this here from top since this StandardTransit shouldn't depend on mtc... + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = 
base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
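# Illustrative sketch (field values hypothetical): each entry appended to
# project_card_changes above is a dictionary along the lines of
#
#   {
#       "category": "Transit Service Property Change",
#       "facility": {
#           "route_id": "14", "direction_id": 0, "shape_id": "10014",
#           "start_time": "06:00", "end_time": "10:00",
#       },
#       "properties": [...],   # the property/shape/stop changes grouped by line_id
#   }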
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
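# Illustrative sketch (hedged; the line record below is made up): how the
# grammar and transformer above are typically wired together with lark.
#
#   from lark import Lark
#
#   lin_text = ';;<<PT>><<LINE>>;;\n' \
#              'LINE NAME="0_452_pk1", MODE=5, HEADWAY[1]=10, ONEWAY=T, ' \
#              'NODES=39249, -39240, 54648\n'
#   parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug=False)
#   parse_tree = parser.parse(lin_text)
#   transit_network_dict = CubeTransformer().transform(parse_tree)
#   # transit_network_dict["program_type"] -> "PT"
#   # transit_network_dict["lines"] maps each line NAME to a dict with
#   # "line_properties" (attribute dict) and "line_shape" (node DataFrame)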
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/lasso/util/index.html b/branch/test_no_change/_modules/lasso/util/index.html new file mode 100644 index 0000000..4b81954 --- /dev/null +++ b/branch/test_no_change/_modules/lasso/util/index.html @@ -0,0 +1,256 @@ + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+
+from network_wrangler import WranglerLogger
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None): + """ + Calculated per: + https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565 + Expected in/out + -93.0965985, 44.952112199999995 osm_node_id = 954734870 + 69f13f881649cb21ee3b359730790bb9 + + """ + import hashlib + + message = "Intersection {0:.5f} {0:.5f}".format(long, lat) + if osm_node_id: + message += " {}".format(osm_node_id) + unhashed = message.encode("utf-8") + hash = hashlib.md5(unhashed).hexdigest() + return hash
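# Usage sketch, reusing the inputs from the docstring above:
#
#   get_shared_streets_intersection_hash(44.952112199999995, -93.0965985, 954734870)
#
# returns an md5 hex digest of the "Intersection ..." message. Note that the
# format string as written interpolates argument 0 twice, so only the
# longitude value actually appears in the hashed message.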
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
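# Usage sketch for the two converters above (values illustrative):
#
#   >>> hhmmss_to_datetime("06:30:00")
#   datetime.time(6, 30)
#   >>> secs_to_datetime(23400)      # 6.5 hours after midnight
#   datetime.time(6, 30)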
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters): + """ + creates circular buffer polygon for node + + Args: + lat: node lat + lon: node lon + meters: buffer distance, radius of circle + Returns: + Polygon + """ + proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84') + # Azimuthal equidistant projection + aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0' + project = partial( + pyproj.transform, + pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)), + proj_wgs84) + buf = Point(0, 0).buffer(meters) # distance in meters + return Polygon(transform(project, buf).exterior.coords[:])
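# Usage sketch (coordinates hypothetical): build a roughly 100 m circular
# buffer, returned in WGS84 lon/lat, around a node.
#
#   buffer_polygon = geodesic_point_buffer(44.95, -93.09, 100)
#
# Note this relies on the legacy partial/pyproj.transform style, which is
# deprecated in pyproj >= 2; it still illustrates the intended round-trip
# through an azimuthal equidistant projection centered on the node.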
+ +
[docs]def create_locationreference(node, link): + node['X'] = node['geometry'].apply(lambda p: p.x) + node['Y'] = node['geometry'].apply(lambda p: p.y) + node['point'] = [list(xy) for xy in zip(node.X, node.Y)] + node_dict = dict(zip(node.model_node_id, node.point)) + + link['A_point'] = link['A'].map(node_dict) + link['B_point'] = link['B'].map(node_dict) + link['locationReferences'] = link.apply(lambda x: [{'sequence':1, + 'point': x['A_point'], + 'distanceToNextRef':x['length'], + 'bearing' : 0, + 'intersectionId':x['fromIntersectionId']}, + {'sequence':2, + 'point': x['B_point'], + 'intersectionId':x['toIntersectionId']}], + axis = 1)
+ +
[docs]def column_name_to_parts(c, parameters=None): + + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + + return base_name, time_period, category, managed
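# Usage sketch (hedged; assumes the default Parameters, where "lanes" is a
# property to split and "AM" is a configured time period):
#
#   >>> column_name_to_parts("lanes_AM")
#   ('lanes', 'AM', None, False)
#
# Columns whose names start with "ML" are flagged as managed-lane columns via
# the fourth element of the returned tuple.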
+ +
[docs]def shorten_name(name): + if type(name) == str: + name_list = name.split(',') + else: + name_list = name + name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list] + + name_list = list(set(name_list)) + #name_list.remove('') + + name_new = ' '.join(name_list).strip(' ') + + # convert non english character to english + name_new = unidecode(name_new) + + return name_new
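# Usage sketch: duplicate fragments and 'nan' placeholders collapse to a
# single cleaned, ASCII-transliterated string.
#
#   >>> shorten_name("Main St,Main St,nan")
#   'Main St'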
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/shapely/geometry/point/index.html b/branch/test_no_change/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..57e05b8 --- /dev/null +++ b/branch/test_no_change/_modules/shapely/geometry/point/index.html @@ -0,0 +1,252 @@ + + + + + + shapely.geometry.point — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+import numpy as np
+
+import shapely
+from shapely.errors import DimensionError
+from shapely.geometry.base import BaseGeometry
+
+__all__ = ["Point"]
+
+
+
[docs]class Point(BaseGeometry): + """ + A geometry type that represents a single coordinate with + x,y and possibly z values. + + A point is a zero-dimensional feature and has zero length and zero area. + + Parameters + ---------- + args : float, or sequence of floats + The coordinates can either be passed as a single parameter, or as + individual float values using multiple parameters: + + 1) 1 parameter: a sequence or array-like of with 2 or 3 values. + 2) 2 or 3 parameters (float): x, y, and possibly z. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Examples + -------- + Constructing the Point using separate parameters for x and y: + + >>> p = Point(1.0, -1.0) + + Constructing the Point using a list of x, y coordinates: + + >>> p = Point([1.0, -1.0]) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + __slots__ = [] + + def __new__(self, *args): + if len(args) == 0: + # empty geometry + # TODO better constructor + return shapely.from_wkt("POINT EMPTY") + elif len(args) > 3: + raise TypeError(f"Point() takes at most 3 arguments ({len(args)} given)") + elif len(args) == 1: + coords = args[0] + if isinstance(coords, Point): + return coords + + # Accept either (x, y) or [(x, y)] + if not hasattr(coords, "__getitem__"): # generators + coords = list(coords) + coords = np.asarray(coords).squeeze() + else: + # 2 or 3 args + coords = np.array(args).squeeze() + + if coords.ndim > 1: + raise ValueError( + f"Point() takes only scalar or 1-size vector arguments, got {args}" + ) + if not np.issubdtype(coords.dtype, np.number): + coords = [float(c) for c in coords] + geom = shapely.points(coords) + if not isinstance(geom, Point): + raise ValueError("Invalid values passed to Point constructor") + return geom + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return float(shapely.get_x(self)) + + @property + def y(self): + """Return y coordinate.""" + return float(shapely.get_y(self)) + + @property + def z(self): + """Return z coordinate.""" + if not shapely.has_z(self): + raise DimensionError("This point has no z coordinate.") + # return shapely.get_z(self) -> get_z only supported for GEOS 3.7+ + return self.coords[0][2] + + @property + def __geo_interface__(self): + return {"type": "Point", "coordinates": self.coords[0]} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3.0 * scale_factor, 1.0 * scale_factor, fill_color, opacity)
+ + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +shapely.lib.registry[0] = Point +
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/shapely/geometry/polygon/index.html b/branch/test_no_change/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..3f094ba --- /dev/null +++ b/branch/test_no_change/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,462 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import numpy as np
+
+import shapely
+from shapely.algorithms.cga import is_ccw_impl, signed_area
+from shapely.errors import TopologicalError
+from shapely.geometry.base import BaseGeometry
+from shapely.geometry.linestring import LineString
+from shapely.geometry.point import Point
+
+__all__ = ["orient", "Polygon", "LinearRing"]
+
+
+def _unpickle_linearring(wkb):
+    linestring = shapely.from_wkb(wkb)
+    srid = shapely.get_srid(linestring)
+    linearring = shapely.linearrings(shapely.get_coordinates(linestring))
+    if srid:
+        linearring = shapely.set_srid(linearring, srid)
+    return linearring
+
+
+class LinearRing(LineString):
+    """
+    A geometry type composed of one or more line segments
+    that forms a closed loop.
+
+    A LinearRing is a closed, one-dimensional feature.
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+
+    Parameters
+    ----------
+    coordinates : sequence
+        A sequence of (x, y [,z]) numeric coordinate pairs or triples, or
+        an array-like with shape (N, 2) or (N, 3).
+        Also can be a sequence of Point objects.
+
+    Notes
+    -----
+    Rings are automatically closed. There is no need to specify a final
+    coordinate pair identical to the first.
+
+    Examples
+    --------
+    Construct a square ring.
+
+    >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+    >>> ring.is_closed
+    True
+    >>> list(ring.coords)
+    [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
+    >>> ring.length
+    4.0
+
+    """
+
+    __slots__ = []
+
+    def __new__(self, coordinates=None):
+        if coordinates is None:
+            # empty geometry
+            # TODO better way?
+            return shapely.from_wkt("LINEARRING EMPTY")
+        elif isinstance(coordinates, LineString):
+            if type(coordinates) == LinearRing:
+                # return original objects since geometries are immutable
+                return coordinates
+            elif not coordinates.is_valid:
+                raise TopologicalError("An input LineString must be valid.")
+            else:
+                # LineString
+                # TODO convert LineString to LinearRing more directly?
+                coordinates = coordinates.coords
+
+        else:
+            if hasattr(coordinates, "__array__"):
+                coordinates = np.asarray(coordinates)
+            if isinstance(coordinates, np.ndarray) and np.issubdtype(
+                coordinates.dtype, np.number
+            ):
+                pass
+            else:
+                # check coordinates on points
+                def _coords(o):
+                    if isinstance(o, Point):
+                        return o.coords[0]
+                    else:
+                        return [float(c) for c in o]
+
+                coordinates = np.array([_coords(o) for o in coordinates])
+                if not np.issubdtype(coordinates.dtype, np.number):
+                    # conversion of coords to 2D array failed, this might be due
+                    # to inconsistent coordinate dimensionality
+                    raise ValueError("Inconsistent coordinate dimensionality")
+
+        if len(coordinates) == 0:
+            # empty geometry
+            # TODO better constructor + should shapely.linearrings handle this?
+            return shapely.from_wkt("LINEARRING EMPTY")
+
+        geom = shapely.linearrings(coordinates)
+        if not isinstance(geom, LinearRing):
+            raise ValueError("Invalid values passed to LinearRing constructor")
+        return geom
+
+    @property
+    def __geo_interface__(self):
+        return {"type": "LinearRing", "coordinates": tuple(self.coords)}
+
+    def __reduce__(self):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        return (_unpickle_linearring, (shapely.to_wkb(self, include_srid=True),))
+
+    @property
+    def is_ccw(self):
+        """True is the ring is oriented counter clock-wise"""
+        return bool(is_ccw_impl()(self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return bool(shapely.is_simple(self))
+
+
+shapely.lib.registry[2] = LinearRing
+
+
+class InteriorRingSequence:
+
+    _parent = None
+    _ndim = None
+    _index = 0
+    _length = 0
+
+    def __init__(self, parent):
+        self._parent = parent
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return shapely.get_num_interior_rings(self._parent)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    def _get_ring(self, i):
+        return shapely.get_interior_ring(self._parent, i)
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A geometry type representing an area that is enclosed by a linear ring. + + A polygon is a two-dimensional feature and has a non-zero area. It may + have one or more negative-space "holes" which are also bounded by linear + rings. If any rings cross each other, the feature is invalid and + operations on it may fail. + + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples, or + an array-like with shape (N, 2) or (N, 3). + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + + Examples + -------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + + __slots__ = [] + + def __new__(self, shell=None, holes=None): + if shell is None: + # empty geometry + # TODO better way? + return shapely.from_wkt("POLYGON EMPTY") + elif isinstance(shell, Polygon): + # return original objects since geometries are immutable + return shell + else: + shell = LinearRing(shell) + + if holes is not None: + if len(holes) == 0: + # shapely constructor cannot handle holes=[] + holes = None + else: + holes = [LinearRing(ring) for ring in holes] + + geom = shapely.polygons(shell, holes=holes) + if not isinstance(geom, Polygon): + raise ValueError("Invalid values passed to Polygon constructor") + return geom + + @property + def exterior(self): + return shapely.get_exterior_ring(self) + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not" + ) + + def __eq__(self, other): + if not isinstance(other, BaseGeometry): + return NotImplemented + if not isinstance(other, Polygon): + return False + check_empty = (self.is_empty, other.is_empty) + if all(check_empty): + return True + elif any(check_empty): + return False + my_coords = [self.exterior.coords] + [ + interior.coords for interior in self.interiors + ] + other_coords = [other.exterior.coords] + [ + interior.coords for interior in other.interiors + ] + if not len(my_coords) == len(other_coords): + return False + # equal_nan=False is the default, but not yet available for older numpy + return np.all( + [ + np.array_equal(left, right) # , equal_nan=False) + for left, right in zip(my_coords, other_coords) + ] + ) + + def __hash__(self): + return super().__hash__() + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return {"type": "Polygon", "coordinates": tuple(coords)} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] for interior in self.interiors + ] + path = " ".join( + [ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords + ] + ) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2.0 * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin)])
+ + +shapely.lib.registry[3] = Polygon + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring) / s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring) / s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) +
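# Usage sketch for orient(): the returned polygon's exterior ring winds
# counter-clockwise for the default sign=1.0.
#
#   cw_square = Polygon([(0, 0), (0, 1), (1, 1), (1, 0)])  # clockwise shell
#   ccw_square = orient(cw_square)
#   # list(ccw_square.exterior.coords)[:2] -> [(0.0, 0.0), (1.0, 0.0)]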
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_modules/shapely/ops/index.html b/branch/test_no_change/_modules/shapely/ops/index.html new file mode 100644 index 0000000..48045cb --- /dev/null +++ b/branch/test_no_change/_modules/shapely/ops/index.html @@ -0,0 +1,845 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from warnings import warn
+
+import shapely
+from shapely.algorithms.polylabel import polylabel  # noqa
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.geometry import (
+    GeometryCollection,
+    LineString,
+    MultiLineString,
+    MultiPoint,
+    Point,
+    Polygon,
+    shape,
+)
+from shapely.geometry.base import BaseGeometry, BaseMultipartGeometry
+from shapely.geometry.polygon import orient as orient_
+from shapely.prepared import prep
+
+__all__ = [
+    "cascaded_union",
+    "linemerge",
+    "operator",
+    "polygonize",
+    "polygonize_full",
+    "transform",
+    "unary_union",
+    "triangulate",
+    "voronoi_diagram",
+    "split",
+    "nearest_points",
+    "validate",
+    "snap",
+    "shared_paths",
+    "clip_by_rect",
+    "orient",
+    "substring",
+]
+
+
+class CollectionOperator:
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        collection = shapely.polygonize(obs)
+        return collection.geoms
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, cut edges, dangles, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        return shapely.polygonize_full(obs)
+
+    def linemerge(self, lines, directed=False):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.  Returns a
+        LineString or MultiLineString when lines are not contiguous.
+        """
+        source = None
+        if getattr(lines, "geom_type", None) == "MultiLineString":
+            source = lines
+        elif hasattr(lines, "geoms"):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, "__iter__"):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError(f"Cannot linemerge {lines}")
+        return shapely.line_merge(source, directed=directed)
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        .. deprecated:: 1.8
+            This function was superseded by :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning,
+            stacklevel=2,
+        )
+        return shapely.union_all(geoms, axis=None)
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        Usually used to convert a collection into the smallest set of polygons
+        that cover the same area.
+        """
+        return shapely.union_all(geoms, axis=None)
+
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise the list of LineString edges is returned.
+
+    """
+    collection = shapely.delaunay_triangles(geom, tolerance=tolerance, only_edges=edges)
+    return [g for g in collection.geoms]
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a list of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The tolerance `argument` can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    try:
+        result = shapely.voronoi_polygons(
+            geom, tolerance=tolerance, extend_to=envelope, only_edges=edges
+        )
+    except shapely.GEOSException as err:
+        errstr = "Could not create Voronoi Diagram with the specified inputs "
+        errstr += f"({err!s})."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr) from err
+
+    if result.geom_type != "GeometryCollection":
+        return GeometryCollection([result])
+    return result
+
+
+def validate(geom):
+    return shapely.is_valid_reason(geom)
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.geom_type in ("Point", "LineString", "LinearRing", "Polygon"): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)(zip(*func(*zip(*geom.exterior.coords)))) + holes = list( + type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)([func(*c) for c in geom.exterior.coords]) + holes = list( + type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + elif geom.geom_type.startswith("Multi") or geom.geom_type == "GeometryCollection": + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError(f"Type {geom.geom_type!r} not recognized")
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = shapely.shortest_line(g1, g2) + if seq is None: + if g1.is_empty: + raise ValueError("The first input geometry is empty") + else: + raise ValueError("The second input geometry is empty") + + p1 = shapely.get_point(seq, 0) + p2 = shapely.get_point(seq, 1) + return (p1, p2) + + +def snap(g1, g2, tolerance): + """ + Snaps an input geometry (g1) to reference (g2) geometry's vertices. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Refer to :func:`shapely.snap` for full documentation. + """ + + return shapely.snap(g1, g2, tolerance) + + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return shapely.shared_paths(g1, g2) + + +class SplitOp: + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [ + pg for pg in polygonize(union) if poly.contains(pg.representative_point()) + ] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.geom_type in ("Polygon", "MultiPolygon"): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance( + splitter, MultiLineString + ): + raise GeometryTypeError( + "Second argument must be either a LineString or a MultiLineString" + ) + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == "1": + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError("Input geometry segment overlaps with the splitter.") + elif relation[0] == "0" or relation[3] == "0": + # The splitter crosses or touches the line's 
interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, "0********"): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords) - 1): + point1 = coords[i] + point2 = coords[i + 1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx**2 + dy**2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [LineString(coords[: i + 2]), LineString(coords[i + 1 :])] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[: i + 1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i + 1 :]), + ] + return [line] + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. + When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. 
+ + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.geom_type in ("MultiLineString", "MultiPolygon"): + return GeometryCollection( + [i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms] + ) + + elif geom.geom_type == "LineString": + if splitter.geom_type in ( + "LineString", + "MultiLineString", + "Polygon", + "MultiPolygon", + ): + split_func = SplitOp._split_line_with_line + elif splitter.geom_type == "Point": + split_func = SplitOp._split_line_with_point + elif splitter.geom_type == "MultiPoint": + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError( + f"Splitting a LineString with a {splitter.geom_type} is not supported" + ) + + elif geom.geom_type == "Polygon": + if splitter.geom_type == "LineString": + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError( + f"Splitting a Polygon with a {splitter.geom_type} is not supported" + ) + + else: + raise GeometryTypeError( + f"Splitting {geom.geom_type} geometry is not supported" + ) + + return GeometryCollection(split_func(geom, splitter)) + + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. + + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError( + "Can only calculate a substring of LineString geometries. " + f"A {geom.geom_type} was provided." 
+ ) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [tuple(*end_point.coords)] + else: + vertex_list = [tuple(*start_point.coords)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append(tuple(*start_point.coords)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append(tuple(*end_point.coords)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + return shapely.clip_by_rect(geom, xmin, ymin, xmax, ymax) + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. + + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/_sources/_generated/lasso.CubeTransit.rst.txt b/branch/test_no_change/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/branch/test_no_change/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/branch/test_no_change/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/branch/test_no_change/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/branch/test_no_change/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + ~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + 
~ModelRoadwayNetwork.path_search + ~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/branch/test_no_change/_sources/_generated/lasso.Parameters.rst.txt b/branch/test_no_change/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..4aeacfb --- /dev/null +++ b/branch/test_no_change/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.maz_shape_file + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/branch/test_no_change/_sources/_generated/lasso.Project.rst.txt b/branch/test_no_change/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..e6e6bcc --- /dev/null +++ b/branch/test_no_change/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatability + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/branch/test_no_change/_sources/_generated/lasso.StandardTransit.rst.txt b/branch/test_no_change/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..4fae048 --- /dev/null +++ b/branch/test_no_change/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,33 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_dict_list + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/branch/test_no_change/_sources/_generated/lasso.logger.rst.txt b/branch/test_no_change/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/branch/test_no_change/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/branch/test_no_change/_sources/_generated/lasso.util.rst.txt b/branch/test_no_change/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/branch/test_no_change/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/branch/test_no_change/_sources/autodoc.rst.txt b/branch/test_no_change/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/branch/test_no_change/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/branch/test_no_change/_sources/index.rst.txt b/branch/test_no_change/_sources/index.rst.txt new file mode 100644 index 0000000..616853c --- /dev/null +++ b/branch/test_no_change/_sources/index.rst.txt @@ -0,0 +1,36 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +`network_wrangler `_ package +for MetCouncil and MTC. It aims to have the following functionality: + +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for the respective agency and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree::
+   :maxdepth: 3
+   :caption: Contents:
+
+   starting
+   setup
+   running
+   autodoc
+
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/branch/test_no_change/_sources/running.md.txt b/branch/test_no_change/_sources/running.md.txt
new file mode 100644
index 0000000..e139dc8
--- /dev/null
+++ b/branch/test_no_change/_sources/running.md.txt
@@ -0,0 +1,12 @@
+# Running Lasso
+
+## Create project files
+
+
+## Create a scenario
+
+
+## Exporting networks
+
+
+## Auditing and Reporting
diff --git a/branch/test_no_change/_sources/setup.md.txt b/branch/test_no_change/_sources/setup.md.txt
new file mode 100644
index 0000000..e77d463
--- /dev/null
+++ b/branch/test_no_change/_sources/setup.md.txt
@@ -0,0 +1,9 @@
+# Setup
+
+### Projects
+
+### Parameters
+
+### Settings
+
+### Additional Data Files
diff --git a/branch/test_no_change/_sources/starting.md.txt b/branch/test_no_change/_sources/starting.md.txt
new file mode 100644
index 0000000..8886f95
--- /dev/null
+++ b/branch/test_no_change/_sources/starting.md.txt
@@ -0,0 +1,292 @@
+# Starting Out
+
+## Installation
+
+If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments.
+
+The following example uses a conda environment (recommended) and the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub.
+
+```bash
+conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+```
+
+Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`.
+
+#### Bleeding Edge
+If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing each of them from the `develop` branch of its repository:
+
+```bash
+conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+```
+
+#### From Clone
+If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The `-e` flag will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs).
+
+**If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!**
+
+```bash
+conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+```
+
+Notes:
+
+1. The `-e` flag installs the package in editable mode.
+2. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo).
+3. If you want to install from a specific tag/version number or branch, replace `@master` with `@<branch name>` or `@<tag>`.
+4. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same cloning and installation instructions used above for Lasso.
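+
+Once one of the installation options above has completed, a quick sanity check is to import both packages from the activated environment. The check below is only a sketch: it assumes one of the environments created above is active, and it verifies nothing beyond the two packages being importable.
+
+```python
+# Minimal installation check: confirm that lasso and network_wrangler can be
+# imported in the activated conda environment and show where they were
+# installed from.
+import network_wrangler
+import lasso
+
+print(network_wrangler.__file__)
+print(lasso.__file__)
+```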
+
+If you are going to be doing Lasso development, we also recommend:
+ - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc.
+ with Python syntax highlighting turned on.
+ - [GitHub Desktop](https://desktop.github.com/) to locally update your clones
+
+## Brief Intro
+
+Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility.
+
+Both Lasso and NetworkWrangler are built around the following data schemas:
+ - [`roadway network`], which is based on a mashup of Open Street Map and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three json files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category.
+ - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and
+ - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml.
+
+In addition, Lasso utilizes the following data schemas:
+
+ - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model, including breaking out data fields by time period.
+ - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and
+ - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.
+ - [`Cube public transport line files`], which define a set of transit lines in the cube software.
+
+### Components
+Network Wrangler has the following atomic parts:
+
+ - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames;
+ - _TransitNetwork_ object, which represents the `transit network` data as DataFrames;
+ - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;
+ - _Scenario_ object, which consists of at least a RoadwayNetwork and a
+TransitNetwork. Scenarios can be based on or tiered from other scenarios.
+Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.
+
+In addition, Lasso has the following atomic parts:
+
+ - _Project_ object, which creates project cards from one of the following: a base and a build transit network in cube format, a base and a build highway network, or a base highway network and a Cube log file.
+ - _ModelRoadwayNetwork_ object, a subclass of `RoadwayNetwork` that contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.
+ - _StandardTransit_, an object for holding a standard transit feed as a Partridge object; it contains
+ methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files.
+ - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files`.
+ It has the capability to parse cube line file properties and shapes into python dictionaries, to compare line files, and to represent changes as Project Card dictionaries.
+ - _Parameters_, a class representing all the parameters defining the networks,
+ including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance
+ with a keyword argument setting the attribute. Parameters that are
+ not explicitly set will use the default parameters listed in this class.
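+
+All of these components are used directly from Python. The import sketch below shows where they typically come from; the `lasso` names follow the class listing in this documentation, while the `network_wrangler` import path is assumed to expose its core objects at the package level. The sections that follow demonstrate each object.
+
+```python
+# Typical imports for the components described above (a sketch; the
+# network_wrangler package-level exports are assumed).
+from network_wrangler import RoadwayNetwork, TransitNetwork, ProjectCard, Scenario
+
+from lasso import Project, ModelRoadwayNetwork, StandardTransit, CubeTransit, Parameters
+```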
+
+#### RoadwayNetwork
+
+Reads, writes, queries, and manipulates roadway network data, which
+is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`.
+
+```python
+net = RoadwayNetwork.read(
+    link_filename=MY_LINK_FILE,
+    node_filename=MY_NODE_FILE,
+    shape_filename=MY_SHAPE_FILE,
+    shape_foreign_key='shape_id',
+)
+
+# select a segment of I 35E between two OSM nodes
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+    },
+    {
+        'property': 'drive_access',
+        'set': 0,
+    },
+]
+
+# apply the property changes to the selected links
+net.apply_roadway_feature_change(
+    net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+    mode="walk",
+    ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+```
+#### TransitNetwork
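+
+Represents the `transit network` data (a frequency-based GTFS feed) as DataFrames. A minimal sketch of reading a feed, mirroring the usage in the Scenario example below; `STPAUL_DIR` stands in for a directory containing a standard transit feed:
+
+```python
+from network_wrangler import TransitNetwork
+
+# STPAUL_DIR is a placeholder for the directory holding the standard
+# transit feed files, as in the Scenario example below.
+STPAUL_DIR = "examples/stpaul"
+
+transit_net = TransitNetwork.read(STPAUL_DIR)
+```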
+
+#### ProjectCard
+
+#### Scenario
+
+Manages sets of project cards and tiering from a base scenario/set of networks.
+
+```python
+import os
+
+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+    base_scenario=my_base_scenario,
+    project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+```
+
+#### Project
+Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network.
+
+```python
+import os
+
+test_project = Project.create_project(
+    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+```
+
+#### ModelRoadwayNetwork
+A subclass of network_wrangler's RoadwayNetwork
+class which has additional understanding about how to translate and write the
+network out to the MetCouncil Roadway Network schema.
+
+```Python
+net = ModelRoadwayNetwork.read(
+    link_filename=STPAUL_LINK_FILE,
+    node_filename=STPAUL_NODE_FILE,
+    shape_filename=STPAUL_SHAPE_FILE,
+    fast=True,
+    shape_foreign_key='shape_id',
+)
+
+net.write_roadway_as_fixedwidth()
+
+```
+
+#### StandardTransit
+Translates the standard GTFS data to MetCouncil's Cube Line files.
+
+```Python
+cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+```
+
+#### CubeTransit
+Used by the project class and has the capability to:
+ - Parse cube line file properties and shapes into python dictionaries
+ - Compare line files and represent changes as Project Card dictionaries
+
+```python
+tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+```
+
+#### Parameters
+Holds information about default parameters but can
+also be initialized to override those parameters at object instantiation using a dictionary.
+
+```Python
+import yaml
+
+# read parameters from a yaml configuration file
+# (they could also be provided directly as a dictionary of key/value pairs)
+with open(config_file) as f:
+    my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+)
+# the network is then written as directed by the parameters given
+model_road_net.write_roadway_as_shp()
+
+```
+
+### Typical Workflow
+
+Workflows in Lasso and Network Wrangler typically accomplish one of two goals:
+1. Create Project Cards to document network changes as a result of either transit or roadway projects.
+2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.
+
+#### Project Cards from Transit LIN Files
+
+
+#### Project Cards from Cube LOG Files
+
+
+#### Model Network Files for a Scenario
+
+
+
+## Running Quickstart Jupyter Notebooks
+
+To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory:
+
+ - `Lasso Project Card Creation Quickstart.ipynb`
+ - `Lasso Scenario Creation Quickstart.ipynb`
+
+ Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`:
+
+ ```bash
+ conda activate <my_lasso_environment>
+ jupyter notebook
+ ```
diff --git a/branch/test_no_change/_static/_sphinx_javascript_frameworks_compat.js b/branch/test_no_change/_static/_sphinx_javascript_frameworks_compat.js
new file mode 100644
index 0000000..8141580
--- /dev/null
+++ b/branch/test_no_change/_static/_sphinx_javascript_frameworks_compat.js
@@ -0,0 +1,123 @@
+/* Compatability shim for jQuery and underscores.js.
+ *
+ * Copyright Sphinx contributors
+ * Released under the two clause BSD licence
+ */
+
+/**
+ * small helper function to urldecode strings
+ *
+ * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL
+ */
+jQuery.urldecode = function(x) {
+  if (!x) {
+    return x
+  }
+  return decodeURIComponent(x.replace(/\+/g, ' '));
+};
+
+/**
+ * small helper function to urlencode strings
+ */
+jQuery.urlencode = encodeURIComponent;
+
+/**
+ * This function returns the parsed url parameters of the
+ * current request. Multiple values per key are supported,
+ * it will always return arrays of strings for the value parts.
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/branch/test_no_change/_static/basic.css b/branch/test_no_change/_static/basic.css new file mode 100644 index 0000000..cfc60b8 --- /dev/null +++ b/branch/test_no_change/_static/basic.css @@ -0,0 +1,921 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +nav.contents, +aside.topic, +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + 
border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; 
+ word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +.sig dd { + margin-top: 0px; + margin-bottom: 0px; +} + +.sig dl { + margin-top: 0px; + margin-bottom: 0px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +.translated { + background-color: rgba(207, 255, 207, 0.2) +} + +.untranslated { + background-color: rgba(255, 207, 207, 0.2) +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: 
-1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/branch/test_no_change/_static/css/badge_only.css b/branch/test_no_change/_static/css/badge_only.css new file mode 100644 index 0000000..c718cee --- /dev/null +++ b/branch/test_no_change/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd 
a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/branch/test_no_change/_static/css/fonts/Roboto-Slab-Bold.woff b/branch/test_no_change/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/branch/test_no_change/_static/css/fonts/Roboto-Slab-Bold.woff2 b/branch/test_no_change/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/branch/test_no_change/_static/css/fonts/Roboto-Slab-Regular.woff b/branch/test_no_change/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/branch/test_no_change/_static/css/fonts/Roboto-Slab-Regular.woff2 b/branch/test_no_change/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/branch/test_no_change/_static/css/fonts/fontawesome-webfont.eot b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/branch/test_no_change/_static/css/fonts/fontawesome-webfont.svg b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/branch/test_no_change/_static/css/fonts/fontawesome-webfont.ttf b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/branch/test_no_change/_static/css/fonts/fontawesome-webfont.woff b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/branch/test_no_change/_static/css/fonts/fontawesome-webfont.woff2 b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/branch/test_no_change/_static/css/fonts/lato-bold-italic.woff b/branch/test_no_change/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/lato-bold-italic.woff differ diff --git a/branch/test_no_change/_static/css/fonts/lato-bold-italic.woff2 b/branch/test_no_change/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/branch/test_no_change/_static/css/fonts/lato-bold.woff b/branch/test_no_change/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/lato-bold.woff differ diff --git a/branch/test_no_change/_static/css/fonts/lato-bold.woff2 b/branch/test_no_change/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/lato-bold.woff2 differ diff --git a/branch/test_no_change/_static/css/fonts/lato-normal-italic.woff b/branch/test_no_change/_static/css/fonts/lato-normal-italic.woff new file mode 100644 
index 0000000..76114bc Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/lato-normal-italic.woff differ diff --git a/branch/test_no_change/_static/css/fonts/lato-normal-italic.woff2 b/branch/test_no_change/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/branch/test_no_change/_static/css/fonts/lato-normal.woff b/branch/test_no_change/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/lato-normal.woff differ diff --git a/branch/test_no_change/_static/css/fonts/lato-normal.woff2 b/branch/test_no_change/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/branch/test_no_change/_static/css/fonts/lato-normal.woff2 differ diff --git a/branch/test_no_change/_static/css/theme.css b/branch/test_no_change/_static/css/theme.css new file mode 100644 index 0000000..19a446a --- /dev/null +++ b/branch/test_no_change/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 
0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
span{position:absolute;left:48px;display:block;font-size:12px;color:#ccc;line-height:1}.wy-switch.active:before{background:#1e8449}.wy-switch.active:after{left:24px;background:#27ae60}.wy-switch.disabled{cursor:not-allowed;opacity:.8}.wy-control-group.wy-control-group-error .wy-form-message,.wy-control-group.wy-control-group-error>label{color:#e74c3c}.wy-control-group.wy-control-group-error input[type=color],.wy-control-group.wy-control-group-error input[type=date],.wy-control-group.wy-control-group-error input[type=datetime-local],.wy-control-group.wy-control-group-error input[type=datetime],.wy-control-group.wy-control-group-error input[type=email],.wy-control-group.wy-control-group-error input[type=month],.wy-control-group.wy-control-group-error input[type=number],.wy-control-group.wy-control-group-error input[type=password],.wy-control-group.wy-control-group-error input[type=search],.wy-control-group.wy-control-group-error input[type=tel],.wy-control-group.wy-control-group-error input[type=text],.wy-control-group.wy-control-group-error input[type=time],.wy-control-group.wy-control-group-error input[type=url],.wy-control-group.wy-control-group-error input[type=week],.wy-control-group.wy-control-group-error textarea{border:1px solid #e74c3c}.wy-inline-validate{white-space:nowrap}.wy-inline-validate .wy-input-context{padding:.5em .625em;display:inline-block;font-size:80%}.wy-inline-validate.wy-inline-validate-success .wy-input-context{color:#27ae60}.wy-inline-validate.wy-inline-validate-danger .wy-input-context{color:#e74c3c}.wy-inline-validate.wy-inline-validate-warning .wy-input-context{color:#e67e22}.wy-inline-validate.wy-inline-validate-info .wy-input-context{color:#2980b9}.rotate-90{-webkit-transform:rotate(90deg);-moz-transform:rotate(90deg);-ms-transform:rotate(90deg);-o-transform:rotate(90deg);transform:rotate(90deg)}.rotate-180{-webkit-transform:rotate(180deg);-moz-transform:rotate(180deg);-ms-transform:rotate(180deg);-o-transform:rotate(180deg);transform:rotate(180deg)}.rotate-270{-webkit-transform:rotate(270deg);-moz-transform:rotate(270deg);-ms-transform:rotate(270deg);-o-transform:rotate(270deg);transform:rotate(270deg)}.mirror{-webkit-transform:scaleX(-1);-moz-transform:scaleX(-1);-ms-transform:scaleX(-1);-o-transform:scaleX(-1);transform:scaleX(-1)}.mirror.rotate-90{-webkit-transform:scaleX(-1) rotate(90deg);-moz-transform:scaleX(-1) rotate(90deg);-ms-transform:scaleX(-1) rotate(90deg);-o-transform:scaleX(-1) rotate(90deg);transform:scaleX(-1) rotate(90deg)}.mirror.rotate-180{-webkit-transform:scaleX(-1) rotate(180deg);-moz-transform:scaleX(-1) rotate(180deg);-ms-transform:scaleX(-1) rotate(180deg);-o-transform:scaleX(-1) rotate(180deg);transform:scaleX(-1) rotate(180deg)}.mirror.rotate-270{-webkit-transform:scaleX(-1) rotate(270deg);-moz-transform:scaleX(-1) rotate(270deg);-ms-transform:scaleX(-1) rotate(270deg);-o-transform:scaleX(-1) rotate(270deg);transform:scaleX(-1) rotate(270deg)}@media only screen and (max-width:480px){.wy-form button[type=submit]{margin:.7em 0 0}.wy-form input[type=color],.wy-form input[type=date],.wy-form input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=text],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week],.wy-form label{margin-bottom:.3em;display:block}.wy-form input[type=color],.wy-form input[type=date],.wy-form 
input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week]{margin-bottom:0}.wy-form-aligned .wy-control-group label{margin-bottom:.3em;text-align:left;display:block;width:100%}.wy-form-aligned .wy-control{margin:1.5em 0 0}.wy-form-message,.wy-form-message-inline,.wy-form .wy-help-inline{display:block;font-size:80%;padding:6px 0}}@media screen and (max-width:768px){.tablet-hide{display:none}}@media screen and (max-width:480px){.mobile-hide{display:none}}.float-left{float:left}.float-right{float:right}.full-width{width:100%}.rst-content table.docutils,.rst-content table.field-list,.wy-table{border-collapse:collapse;border-spacing:0;empty-cells:show;margin-bottom:24px}.rst-content table.docutils caption,.rst-content table.field-list caption,.wy-table caption{color:#000;font:italic 85%/1 arial,sans-serif;padding:1em 0;text-align:center}.rst-content table.docutils td,.rst-content table.docutils th,.rst-content table.field-list td,.rst-content table.field-list th,.wy-table td,.wy-table th{font-size:90%;margin:0;overflow:visible;padding:8px 16px}.rst-content table.docutils td:first-child,.rst-content table.docutils th:first-child,.rst-content table.field-list td:first-child,.rst-content table.field-list th:first-child,.wy-table td:first-child,.wy-table th:first-child{border-left-width:0}.rst-content table.docutils thead,.rst-content table.field-list thead,.wy-table thead{color:#000;text-align:left;vertical-align:bottom;white-space:nowrap}.rst-content table.docutils thead th,.rst-content table.field-list thead th,.wy-table thead th{font-weight:700;border-bottom:2px solid #e1e4e5}.rst-content table.docutils td,.rst-content table.field-list td,.wy-table td{background-color:transparent;vertical-align:middle}.rst-content table.docutils td p,.rst-content table.field-list td p,.wy-table td p{line-height:18px}.rst-content table.docutils td p:last-child,.rst-content table.field-list td p:last-child,.wy-table td p:last-child{margin-bottom:0}.rst-content table.docutils .wy-table-cell-min,.rst-content table.field-list .wy-table-cell-min,.wy-table .wy-table-cell-min{width:1%;padding-right:0}.rst-content table.docutils .wy-table-cell-min input[type=checkbox],.rst-content table.field-list .wy-table-cell-min input[type=checkbox],.wy-table .wy-table-cell-min input[type=checkbox]{margin:0}.wy-table-secondary{color:grey;font-size:90%}.wy-table-tertiary{color:grey;font-size:80%}.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,.wy-table-backed,.wy-table-odd td,.wy-table-striped tr:nth-child(2n-1) td{background-color:#f3f6f6}.rst-content table.docutils,.wy-table-bordered-all{border:1px solid #e1e4e5}.rst-content table.docutils td,.wy-table-bordered-all td{border-bottom:1px solid #e1e4e5;border-left:1px solid #e1e4e5}.rst-content table.docutils tbody>tr:last-child td,.wy-table-bordered-all tbody>tr:last-child td{border-bottom-width:0}.wy-table-bordered{border:1px solid #e1e4e5}.wy-table-bordered-rows td{border-bottom:1px solid #e1e4e5}.wy-table-bordered-rows tbody>tr:last-child td{border-bottom-width:0}.wy-table-horizontal td,.wy-table-horizontal th{border-width:0 0 1px;border-bottom:1px solid #e1e4e5}.wy-table-horizontal tbody>tr:last-child td{border-bottom-width:0}.wy-table-responsive{margin-bottom:24px;max-width:100%;overflow:auto}.wy-table-responsive 
table{margin-bottom:0!important}.wy-table-responsive table td,.wy-table-responsive table th{white-space:nowrap}a{color:#2980b9;text-decoration:none;cursor:pointer}a:hover{color:#3091d1}a:visited{color:#9b59b6}html{height:100%}body,html{overflow-x:hidden}body{font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;font-weight:400;color:#404040;min-height:100%;background:#edf0f2}.wy-text-left{text-align:left}.wy-text-center{text-align:center}.wy-text-right{text-align:right}.wy-text-large{font-size:120%}.wy-text-normal{font-size:100%}.wy-text-small,small{font-size:80%}.wy-text-strike{text-decoration:line-through}.wy-text-warning{color:#e67e22!important}a.wy-text-warning:hover{color:#eb9950!important}.wy-text-info{color:#2980b9!important}a.wy-text-info:hover{color:#409ad5!important}.wy-text-success{color:#27ae60!important}a.wy-text-success:hover{color:#36d278!important}.wy-text-danger{color:#e74c3c!important}a.wy-text-danger:hover{color:#ed7669!important}.wy-text-neutral{color:#404040!important}a.wy-text-neutral:hover{color:#595959!important}.rst-content .toctree-wrapper>p.caption,h1,h2,h3,h4,h5,h6,legend{margin-top:0;font-weight:700;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif}p{line-height:24px;font-size:16px;margin:0 0 24px}h1{font-size:175%}.rst-content .toctree-wrapper>p.caption,h2{font-size:150%}h3{font-size:125%}h4{font-size:115%}h5{font-size:110%}h6{font-size:100%}hr{display:block;height:1px;border:0;border-top:1px solid #e1e4e5;margin:24px 0;padding:0}.rst-content code,.rst-content tt,code{white-space:nowrap;max-width:100%;background:#fff;border:1px solid #e1e4e5;font-size:75%;padding:0 5px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#e74c3c;overflow-x:auto}.rst-content tt.code-large,code.code-large{font-size:90%}.rst-content .section ul,.rst-content .toctree-wrapper ul,.rst-content section ul,.wy-plain-list-disc,article ul{list-style:disc;line-height:24px;margin-bottom:24px}.rst-content .section ul li,.rst-content .toctree-wrapper ul li,.rst-content section ul li,.wy-plain-list-disc li,article ul li{list-style:disc;margin-left:24px}.rst-content .section ul li p:last-child,.rst-content .section ul li ul,.rst-content .toctree-wrapper ul li p:last-child,.rst-content .toctree-wrapper ul li ul,.rst-content section ul li p:last-child,.rst-content section ul li ul,.wy-plain-list-disc li p:last-child,.wy-plain-list-disc li ul,article ul li p:last-child,article ul li ul{margin-bottom:0}.rst-content .section ul li li,.rst-content .toctree-wrapper ul li li,.rst-content section ul li li,.wy-plain-list-disc li li,article ul li li{list-style:circle}.rst-content .section ul li li li,.rst-content .toctree-wrapper ul li li li,.rst-content section ul li li li,.wy-plain-list-disc li li li,article ul li li li{list-style:square}.rst-content .section ul li ol li,.rst-content .toctree-wrapper ul li ol li,.rst-content section ul li ol li,.wy-plain-list-disc li ol li,article ul li ol li{list-style:decimal}.rst-content .section ol,.rst-content .section ol.arabic,.rst-content .toctree-wrapper ol,.rst-content .toctree-wrapper ol.arabic,.rst-content section ol,.rst-content section ol.arabic,.wy-plain-list-decimal,article ol{list-style:decimal;line-height:24px;margin-bottom:24px}.rst-content .section ol.arabic li,.rst-content .section ol li,.rst-content .toctree-wrapper ol.arabic li,.rst-content .toctree-wrapper ol li,.rst-content section ol.arabic li,.rst-content section ol li,.wy-plain-list-decimal li,article ol 
li{list-style:decimal;margin-left:24px}.rst-content .section ol.arabic li ul,.rst-content .section ol li p:last-child,.rst-content .section ol li ul,.rst-content .toctree-wrapper ol.arabic li ul,.rst-content .toctree-wrapper ol li p:last-child,.rst-content .toctree-wrapper ol li ul,.rst-content section ol.arabic li ul,.rst-content section ol li p:last-child,.rst-content section ol li ul,.wy-plain-list-decimal li p:last-child,.wy-plain-list-decimal li ul,article ol li p:last-child,article ol li ul{margin-bottom:0}.rst-content .section ol.arabic li ul li,.rst-content .section ol li ul li,.rst-content .toctree-wrapper ol.arabic li ul li,.rst-content .toctree-wrapper ol li ul li,.rst-content section ol.arabic li ul li,.rst-content section ol li ul li,.wy-plain-list-decimal li ul li,article ol li ul li{list-style:disc}.wy-breadcrumbs{*zoom:1}.wy-breadcrumbs:after,.wy-breadcrumbs:before{display:table;content:""}.wy-breadcrumbs:after{clear:both}.wy-breadcrumbs>li{display:inline-block;padding-top:5px}.wy-breadcrumbs>li.wy-breadcrumbs-aside{float:right}.rst-content .wy-breadcrumbs>li code,.rst-content .wy-breadcrumbs>li tt,.wy-breadcrumbs>li .rst-content tt,.wy-breadcrumbs>li code{all:inherit;color:inherit}.breadcrumb-item:before{content:"/";color:#bbb;font-size:13px;padding:0 6px 0 3px}.wy-breadcrumbs-extra{margin-bottom:0;color:#b3b3b3;font-size:80%;display:inline-block}@media screen and (max-width:480px){.wy-breadcrumbs-extra,.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}@media print{.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}html{font-size:16px}.wy-affix{position:fixed;top:1.618em}.wy-menu a:hover{text-decoration:none}.wy-menu-horiz{*zoom:1}.wy-menu-horiz:after,.wy-menu-horiz:before{display:table;content:""}.wy-menu-horiz:after{clear:both}.wy-menu-horiz li,.wy-menu-horiz ul{display:inline-block}.wy-menu-horiz li:hover{background:hsla(0,0%,100%,.1)}.wy-menu-horiz li.divide-left{border-left:1px solid #404040}.wy-menu-horiz li.divide-right{border-right:1px solid #404040}.wy-menu-horiz a{height:32px;display:inline-block;line-height:32px;padding:0 16px}.wy-menu-vertical{width:300px}.wy-menu-vertical header,.wy-menu-vertical p.caption{color:#55a5d9;height:32px;line-height:32px;padding:0 1.618em;margin:12px 0 0;display:block;font-weight:700;text-transform:uppercase;font-size:85%;white-space:nowrap}.wy-menu-vertical ul{margin-bottom:0}.wy-menu-vertical li.divide-top{border-top:1px solid #404040}.wy-menu-vertical li.divide-bottom{border-bottom:1px solid #404040}.wy-menu-vertical li.current{background:#e3e3e3}.wy-menu-vertical li.current a{color:grey;border-right:1px solid #c9c9c9;padding:.4045em 2.427em}.wy-menu-vertical li.current a:hover{background:#d6d6d6}.rst-content .wy-menu-vertical li tt,.wy-menu-vertical li .rst-content tt,.wy-menu-vertical li code{border:none;background:inherit;color:inherit;padding-left:0;padding-right:0}.wy-menu-vertical li button.toctree-expand{display:block;float:left;margin-left:-1.2em;line-height:18px;color:#4d4d4d;border:none;background:none;padding:0}.wy-menu-vertical li.current>a,.wy-menu-vertical li.on a{color:#404040;font-weight:700;position:relative;background:#fcfcfc;border:none;padding:.4045em 1.618em}.wy-menu-vertical li.current>a:hover,.wy-menu-vertical li.on a:hover{background:#fcfcfc}.wy-menu-vertical li.current>a:hover button.toctree-expand,.wy-menu-vertical li.on a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a 
button.toctree-expand{display:block;line-height:18px;color:#333}.wy-menu-vertical li.toctree-l1.current>a{border-bottom:1px solid #c9c9c9;border-top:1px solid #c9c9c9}.wy-menu-vertical .toctree-l1.current .toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .toctree-l11>ul{display:none}.wy-menu-vertical .toctree-l1.current .current.toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .current.toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .current.toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .current.toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .current.toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .current.toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .current.toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .current.toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .current.toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .current.toctree-l11>ul{display:block}.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4{font-size:.9em}.wy-menu-vertical li.toctree-l2 a,.wy-menu-vertical li.toctree-l3 a,.wy-menu-vertical li.toctree-l4 a,.wy-menu-vertical li.toctree-l5 a,.wy-menu-vertical li.toctree-l6 a,.wy-menu-vertical li.toctree-l7 a,.wy-menu-vertical li.toctree-l8 a,.wy-menu-vertical li.toctree-l9 a,.wy-menu-vertical li.toctree-l10 a{color:#404040}.wy-menu-vertical li.toctree-l2 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l3 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l4 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l5 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l6 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l7 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l8 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l9 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l10 a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{display:block}.wy-menu-vertical li.toctree-l2.current>a{padding:.4045em 2.427em}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{padding:.4045em 1.618em .4045em 4.045em}.wy-menu-vertical li.toctree-l3.current>a{padding:.4045em 4.045em}.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{padding:.4045em 1.618em .4045em 5.663em}.wy-menu-vertical li.toctree-l4.current>a{padding:.4045em 5.663em}.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a{padding:.4045em 1.618em .4045em 7.281em}.wy-menu-vertical li.toctree-l5.current>a{padding:.4045em 7.281em}.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a{padding:.4045em 1.618em .4045em 8.899em}.wy-menu-vertical li.toctree-l6.current>a{padding:.4045em 
8.899em}.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a{padding:.4045em 1.618em .4045em 10.517em}.wy-menu-vertical li.toctree-l7.current>a{padding:.4045em 10.517em}.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a{padding:.4045em 1.618em .4045em 12.135em}.wy-menu-vertical li.toctree-l8.current>a{padding:.4045em 12.135em}.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a{padding:.4045em 1.618em .4045em 13.753em}.wy-menu-vertical li.toctree-l9.current>a{padding:.4045em 13.753em}.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a{padding:.4045em 1.618em .4045em 15.371em}.wy-menu-vertical li.toctree-l10.current>a{padding:.4045em 15.371em}.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{padding:.4045em 1.618em .4045em 16.989em}.wy-menu-vertical li.toctree-l2.current>a,.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{background:#c9c9c9}.wy-menu-vertical li.toctree-l2 button.toctree-expand{color:#a3a3a3}.wy-menu-vertical li.toctree-l3.current>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{background:#bdbdbd}.wy-menu-vertical li.toctree-l3 button.toctree-expand{color:#969696}.wy-menu-vertical li.current ul{display:block}.wy-menu-vertical li ul{margin-bottom:0;display:none}.wy-menu-vertical li ul li a{margin-bottom:0;color:#d9d9d9;font-weight:400}.wy-menu-vertical a{line-height:18px;padding:.4045em 1.618em;display:block;position:relative;font-size:90%;color:#d9d9d9}.wy-menu-vertical a:hover{background-color:#4e4a4a;cursor:pointer}.wy-menu-vertical a:hover button.toctree-expand{color:#d9d9d9}.wy-menu-vertical a:active{background-color:#2980b9;cursor:pointer;color:#fff}.wy-menu-vertical a:active button.toctree-expand{color:#fff}.wy-side-nav-search{display:block;width:300px;padding:.809em;margin-bottom:.809em;z-index:200;background-color:#2980b9;text-align:center;color:#fcfcfc}.wy-side-nav-search input[type=text]{width:100%;border-radius:50px;padding:6px 12px;border-color:#2472a4}.wy-side-nav-search img{display:block;margin:auto auto .809em;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a{color:#fcfcfc;font-size:100%;font-weight:700;display:inline-block;padding:4px 6px;margin-bottom:.809em;max-width:100%}.wy-side-nav-search .wy-dropdown>a:hover,.wy-side-nav-search>a:hover{background:hsla(0,0%,100%,.1)}.wy-side-nav-search .wy-dropdown>a img.logo,.wy-side-nav-search>a img.logo{display:block;margin:0 auto;height:auto;width:auto;border-radius:0;max-width:100%;background:transparent}.wy-side-nav-search .wy-dropdown>a.icon img.logo,.wy-side-nav-search>a.icon img.logo{margin-top:.85em}.wy-side-nav-search>div.version{margin-top:-.4045em;margin-bottom:.809em;font-weight:400;color:hsla(0,0%,100%,.3)}.wy-nav .wy-menu-vertical header{color:#2980b9}.wy-nav .wy-menu-vertical a{color:#b3b3b3}.wy-nav .wy-menu-vertical a:hover{background-color:#2980b9;color:#fff}[data-menu-wrap]{-webkit-transition:all .2s ease-in;-moz-transition:all .2s ease-in;transition:all .2s 
ease-in;position:absolute;opacity:1;width:100%;opacity:0}[data-menu-wrap].move-center{left:0;right:auto;opacity:1}[data-menu-wrap].move-left{right:auto;left:-100%;opacity:0}[data-menu-wrap].move-right{right:-100%;left:auto;opacity:0}.wy-body-for-nav{background:#fcfcfc}.wy-grid-for-nav{position:absolute;width:100%;height:100%}.wy-nav-side{position:fixed;top:0;bottom:0;left:0;padding-bottom:2em;width:300px;overflow-x:hidden;overflow-y:hidden;min-height:100%;color:#9b9b9b;background:#343131;z-index:200}.wy-side-scroll{width:320px;position:relative;overflow-x:hidden;overflow-y:scroll;height:100%}.wy-nav-top{display:none;background:#2980b9;color:#fff;padding:.4045em .809em;position:relative;line-height:50px;text-align:center;font-size:100%;*zoom:1}.wy-nav-top:after,.wy-nav-top:before{display:table;content:""}.wy-nav-top:after{clear:both}.wy-nav-top a{color:#fff;font-weight:700}.wy-nav-top img{margin-right:12px;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-nav-top i{font-size:30px;float:left;cursor:pointer;padding-top:inherit}.wy-nav-content-wrap{margin-left:300px;background:#fcfcfc;min-height:100%}.wy-nav-content{padding:1.618em 3.236em;height:100%;max-width:800px;margin:auto}.wy-body-mask{position:fixed;width:100%;height:100%;background:rgba(0,0,0,.2);display:none;z-index:499}.wy-body-mask.on{display:block}footer{color:grey}footer p{margin-bottom:12px}.rst-content footer span.commit tt,footer span.commit .rst-content tt,footer span.commit code{padding:0;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:1em;background:none;border:none;color:grey}.rst-footer-buttons{*zoom:1}.rst-footer-buttons:after,.rst-footer-buttons:before{width:100%;display:table;content:""}.rst-footer-buttons:after{clear:both}.rst-breadcrumbs-buttons{margin-top:12px;*zoom:1}.rst-breadcrumbs-buttons:after,.rst-breadcrumbs-buttons:before{display:table;content:""}.rst-breadcrumbs-buttons:after{clear:both}#search-results .search li{margin-bottom:24px;border-bottom:1px solid #e1e4e5;padding-bottom:24px}#search-results .search li:first-child{border-top:1px solid #e1e4e5;padding-top:24px}#search-results .search li a{font-size:120%;margin-bottom:12px;display:inline-block}#search-results .context{color:grey;font-size:90%}.genindextable li>ul{margin-left:24px}@media screen and (max-width:768px){.wy-body-for-nav{background:#fcfcfc}.wy-nav-top{display:block}.wy-nav-side{left:-300px}.wy-nav-side.shift{width:85%;left:0}.wy-menu.wy-menu-vertical,.wy-side-nav-search,.wy-side-scroll{width:auto}.wy-nav-content-wrap{margin-left:0}.wy-nav-content-wrap .wy-nav-content{padding:1.618em}.wy-nav-content-wrap.shift{position:fixed;min-width:100%;left:85%;top:0;height:100%;overflow:hidden}}@media screen and (min-width:1100px){.wy-nav-content-wrap{background:rgba(0,0,0,.05)}.wy-nav-content{margin:0;background:#fcfcfc}}@media print{.rst-versions,.wy-nav-side,footer{display:none}.wy-nav-content-wrap{margin-left:0}}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60;*zoom:1}.rst-versions .rst-current-version:after,.rst-versions .rst-current-version:before{display:table;content:""}.rst-versions 
.rst-current-version:after{clear:both}.rst-content .code-block-caption .rst-versions .rst-current-version .headerlink,.rst-content .eqno .rst-versions .rst-current-version .headerlink,.rst-content .rst-versions .rst-current-version .admonition-title,.rst-content code.download .rst-versions .rst-current-version span:first-child,.rst-content dl dt .rst-versions .rst-current-version .headerlink,.rst-content h1 .rst-versions .rst-current-version .headerlink,.rst-content h2 .rst-versions .rst-current-version .headerlink,.rst-content h3 .rst-versions .rst-current-version .headerlink,.rst-content h4 .rst-versions .rst-current-version .headerlink,.rst-content h5 .rst-versions .rst-current-version .headerlink,.rst-content h6 .rst-versions .rst-current-version .headerlink,.rst-content p .rst-versions .rst-current-version .headerlink,.rst-content table>caption .rst-versions .rst-current-version .headerlink,.rst-content tt.download .rst-versions .rst-current-version span:first-child,.rst-versions .rst-current-version .fa,.rst-versions .rst-current-version .icon,.rst-versions .rst-current-version .rst-content .admonition-title,.rst-versions .rst-current-version .rst-content .code-block-caption .headerlink,.rst-versions .rst-current-version .rst-content .eqno .headerlink,.rst-versions .rst-current-version .rst-content code.download span:first-child,.rst-versions .rst-current-version .rst-content dl dt .headerlink,.rst-versions .rst-current-version .rst-content h1 .headerlink,.rst-versions .rst-current-version .rst-content h2 .headerlink,.rst-versions .rst-current-version .rst-content h3 .headerlink,.rst-versions .rst-current-version .rst-content h4 .headerlink,.rst-versions .rst-current-version .rst-content h5 .headerlink,.rst-versions .rst-current-version .rst-content h6 .headerlink,.rst-versions .rst-current-version .rst-content p .headerlink,.rst-versions .rst-current-version .rst-content table>caption .headerlink,.rst-versions .rst-current-version .rst-content tt.download span:first-child,.rst-versions .rst-current-version .wy-menu-vertical li button.toctree-expand,.wy-menu-vertical li .rst-versions .rst-current-version button.toctree-expand{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and 
(max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}.rst-content .toctree-wrapper>p.caption,.rst-content h1,.rst-content h2,.rst-content h3,.rst-content h4,.rst-content h5,.rst-content h6{margin-bottom:24px}.rst-content img{max-width:100%;height:auto}.rst-content div.figure,.rst-content figure{margin-bottom:24px}.rst-content div.figure .caption-text,.rst-content figure .caption-text{font-style:italic}.rst-content div.figure p:last-child.caption,.rst-content figure p:last-child.caption{margin-bottom:0}.rst-content div.figure.align-center,.rst-content figure.align-center{text-align:center}.rst-content .section>a>img,.rst-content .section>img,.rst-content section>a>img,.rst-content section>img{margin-bottom:24px}.rst-content abbr[title]{text-decoration:none}.rst-content.style-external-links a.reference.external:after{font-family:FontAwesome;content:"\f08e";color:#b3b3b3;vertical-align:super;font-size:60%;margin:0 .2em}.rst-content blockquote{margin-left:24px;line-height:24px;margin-bottom:24px}.rst-content pre.literal-block{white-space:pre;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;display:block;overflow:auto}.rst-content div[class^=highlight],.rst-content pre.literal-block{border:1px solid #e1e4e5;overflow-x:auto;margin:1px 0 24px}.rst-content div[class^=highlight] div[class^=highlight],.rst-content pre.literal-block div[class^=highlight]{padding:0;border:none;margin:0}.rst-content div[class^=highlight] td.code{width:100%}.rst-content .linenodiv pre{border-right:1px solid #e6e9ea;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;user-select:none;pointer-events:none}.rst-content div[class^=highlight] pre{white-space:pre;margin:0;padding:12px;display:block;overflow:auto}.rst-content div[class^=highlight] pre .hll{display:block;margin:0 -12px;padding:0 12px}.rst-content .linenodiv pre,.rst-content div[class^=highlight] pre,.rst-content pre.literal-block{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:12px;line-height:1.4}.rst-content div.highlight .gp,.rst-content div.highlight span.linenos{user-select:none;pointer-events:none}.rst-content div.highlight span.linenos{display:inline-block;padding-left:0;padding-right:12px;margin-right:12px;border-right:1px solid #e6e9ea}.rst-content .code-block-caption{font-style:italic;font-size:85%;line-height:1;padding:1em 0;text-align:center}@media print{.rst-content .codeblock,.rst-content div[class^=highlight],.rst-content div[class^=highlight] pre{white-space:pre-wrap}}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning{clear:both}.rst-content .admonition-todo .last,.rst-content .admonition-todo>:last-child,.rst-content .admonition .last,.rst-content .admonition>:last-child,.rst-content .attention .last,.rst-content .attention>:last-child,.rst-content .caution .last,.rst-content .caution>:last-child,.rst-content .danger .last,.rst-content .danger>:last-child,.rst-content .error .last,.rst-content .error>:last-child,.rst-content .hint .last,.rst-content .hint>:last-child,.rst-content .important .last,.rst-content .important>:last-child,.rst-content .note .last,.rst-content .note>:last-child,.rst-content .seealso 
.last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section ol.loweralpha>li,.rst-content .toctree-wrapper ol.loweralpha,.rst-content .toctree-wrapper ol.loweralpha>li,.rst-content section ol.loweralpha,.rst-content section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li,.rst-content .toctree-wrapper ol.upperalpha,.rst-content .toctree-wrapper ol.upperalpha>li,.rst-content section ol.upperalpha,.rst-content section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*,.rst-content .toctree-wrapper ol li>*,.rst-content .toctree-wrapper ul li>*,.rst-content section ol li>*,.rst-content section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child,.rst-content .toctree-wrapper ol li>:first-child,.rst-content .toctree-wrapper ul li>:first-child,.rst-content section ol li>:first-child,.rst-content section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child,.rst-content .toctree-wrapper ol li>p,.rst-content .toctree-wrapper ol li>p:last-child,.rst-content .toctree-wrapper ul li>p,.rst-content .toctree-wrapper ul li>p:last-child,.rst-content section ol li>p,.rst-content section ol li>p:last-child,.rst-content section ul li>p,.rst-content section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child,.rst-content .toctree-wrapper ol li>p:only-child,.rst-content .toctree-wrapper ol li>p:only-child:last-child,.rst-content .toctree-wrapper ul li>p:only-child,.rst-content .toctree-wrapper ul li>p:only-child:last-child,.rst-content section ol li>p:only-child,.rst-content section ol li>p:only-child:last-child,.rst-content section ul li>p:only-child,.rst-content section ul li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul,.rst-content .toctree-wrapper ol li>ol,.rst-content .toctree-wrapper ol li>ul,.rst-content .toctree-wrapper ul li>ol,.rst-content .toctree-wrapper ul li>ul,.rst-content section ol li>ol,.rst-content section ol li>ul,.rst-content section ul li>ol,.rst-content section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul,.rst-content .toctree-wrapper ol.simple li>*,.rst-content .toctree-wrapper ol.simple li ol,.rst-content .toctree-wrapper ol.simple li ul,.rst-content .toctree-wrapper ul.simple li>*,.rst-content .toctree-wrapper ul.simple li ol,.rst-content .toctree-wrapper ul.simple li ul,.rst-content section ol.simple li>*,.rst-content section ol.simple li ol,.rst-content section ol.simple li 
ul,.rst-content section ul.simple li>*,.rst-content section ul.simple li ol,.rst-content section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink{opacity:0;font-size:14px;font-family:FontAwesome;margin-left:.5em}.rst-content .code-block-caption .headerlink:focus,.rst-content .code-block-caption:hover .headerlink,.rst-content .eqno .headerlink:focus,.rst-content .eqno:hover .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink:focus,.rst-content .toctree-wrapper>p.caption:hover .headerlink,.rst-content dl dt .headerlink:focus,.rst-content dl dt:hover .headerlink,.rst-content h1 .headerlink:focus,.rst-content h1:hover .headerlink,.rst-content h2 .headerlink:focus,.rst-content h2:hover .headerlink,.rst-content h3 .headerlink:focus,.rst-content h3:hover .headerlink,.rst-content h4 .headerlink:focus,.rst-content h4:hover .headerlink,.rst-content h5 .headerlink:focus,.rst-content h5:hover .headerlink,.rst-content h6 .headerlink:focus,.rst-content h6:hover .headerlink,.rst-content p.caption .headerlink:focus,.rst-content p.caption:hover .headerlink,.rst-content p .headerlink:focus,.rst-content p:hover .headerlink,.rst-content table>caption .headerlink:focus,.rst-content table>caption:hover .headerlink{opacity:1}.rst-content p a{overflow-wrap:anywhere}.rst-content .wy-table td p,.rst-content .wy-table td ul,.rst-content .wy-table th p,.rst-content .wy-table th ul,.rst-content table.docutils td p,.rst-content table.docutils td ul,.rst-content table.docutils th p,.rst-content table.docutils th ul,.rst-content table.field-list td p,.rst-content table.field-list td ul,.rst-content table.field-list th p,.rst-content table.field-list th ul{font-size:inherit}.rst-content .btn:focus{outline:2px solid}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .citation-reference>span.fn-bracket,.rst-content 
.footnote-reference>span.fn-bracket{display:none}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : "}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:auto minmax(80%,95%)}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{display:inline-grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{display:grid;grid-template-columns:auto auto minmax(.65rem,auto) minmax(40%,95%)}html.writer-html5 .rst-content aside.citation>span.label,html.writer-html5 .rst-content aside.footnote>span.label,html.writer-html5 .rst-content div.citation>span.label{grid-column-start:1;grid-column-end:2}html.writer-html5 .rst-content aside.citation>span.backrefs,html.writer-html5 .rst-content aside.footnote>span.backrefs,html.writer-html5 .rst-content div.citation>span.backrefs{grid-column-start:2;grid-column-end:3;grid-row-start:1;grid-row-end:3}html.writer-html5 .rst-content aside.citation>p,html.writer-html5 .rst-content aside.footnote>p,html.writer-html5 .rst-content div.citation>p{grid-column-start:4;grid-column-end:5}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{margin-bottom:24px}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.citation>dt>span.brackets:before,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.citation>dt>span.brackets:after,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a,html.writer-html5 
.rst-content dl.footnote>dt>span.fn-backref>a{word-break:keep-all}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a:not(:first-child):before,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.citation>dd p,html.writer-html5 .rst-content dl.footnote>dd p{font-size:.9rem}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{padding-left:1rem;padding-right:1rem;font-size:.9rem;line-height:1.2rem}html.writer-html5 .rst-content aside.citation p,html.writer-html5 .rst-content aside.footnote p,html.writer-html5 .rst-content div.citation p{font-size:.9rem;line-height:1.2rem;margin-bottom:12px}html.writer-html5 .rst-content aside.citation span.backrefs,html.writer-html5 .rst-content aside.footnote span.backrefs,html.writer-html5 .rst-content div.citation span.backrefs{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content aside.citation span.backrefs>a,html.writer-html5 .rst-content aside.footnote span.backrefs>a,html.writer-html5 .rst-content div.citation span.backrefs>a{word-break:keep-all}html.writer-html5 .rst-content aside.citation span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content aside.footnote span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content div.citation span.backrefs>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content aside.citation span.label,html.writer-html5 .rst-content aside.footnote span.label,html.writer-html5 .rst-content div.citation span.label{line-height:1.2rem}html.writer-html5 .rst-content aside.citation-list,html.writer-html5 .rst-content aside.footnote-list,html.writer-html5 .rst-content div.citation-list{margin-bottom:24px}html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content aside.footnote-list aside.footnote,html.writer-html5 .rst-content div.citation-list>div.citation,html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content aside.footnote-list aside.footnote code,html.writer-html5 .rst-content aside.footnote-list aside.footnote tt,html.writer-html5 .rst-content aside.footnote code,html.writer-html5 .rst-content aside.footnote tt,html.writer-html5 .rst-content div.citation-list>div.citation code,html.writer-html5 .rst-content div.citation-list>div.citation tt,html.writer-html5 .rst-content dl.citation code,html.writer-html5 .rst-content dl.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content 
.wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel,.rst-content .menuselection{font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .guilabel,.rst-content .menuselection{border:1px solid #7fbbe3;background:#e7f2fa}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/branch/test_no_change/_static/doctools.js b/branch/test_no_change/_static/doctools.js new file mode 100644 index 0000000..d06a71d --- /dev/null +++ b/branch/test_no_change/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. 
+ */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git 
a/branch/test_no_change/_static/documentation_options.js b/branch/test_no_change/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/branch/test_no_change/_static/documentation_options.js @@ -0,0 +1,14 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/branch/test_no_change/_static/file.png b/branch/test_no_change/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/branch/test_no_change/_static/file.png differ diff --git a/branch/test_no_change/_static/graphviz.css b/branch/test_no_change/_static/graphviz.css new file mode 100644 index 0000000..8d81c02 --- /dev/null +++ b/branch/test_no_change/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/branch/test_no_change/_static/jquery.js b/branch/test_no_change/_static/jquery.js new file mode 100644 index 0000000..c4c6022 --- /dev/null +++ b/branch/test_no_change/_static/jquery.js @@ -0,0 +1,2 @@ +/*! jQuery v3.6.0 | (c) OpenJS Foundation and other contributors | jquery.org/license */ +!function(e,t){"use strict";"object"==typeof module&&"object"==typeof module.exports?module.exports=e.document?t(e,!0):function(e){if(!e.document)throw new Error("jQuery requires a window with a document");return t(e)}:t(e)}("undefined"!=typeof window?window:this,function(C,e){"use strict";var t=[],r=Object.getPrototypeOf,s=t.slice,g=t.flat?function(e){return t.flat.call(e)}:function(e){return t.concat.apply([],e)},u=t.push,i=t.indexOf,n={},o=n.toString,v=n.hasOwnProperty,a=v.toString,l=a.call(Object),y={},m=function(e){return"function"==typeof e&&"number"!=typeof e.nodeType&&"function"!=typeof e.item},x=function(e){return null!=e&&e===e.window},E=C.document,c={type:!0,src:!0,nonce:!0,noModule:!0};function b(e,t,n){var r,i,o=(n=n||E).createElement("script");if(o.text=e,t)for(r in c)(i=t[r]||t.getAttribute&&t.getAttribute(r))&&o.setAttribute(r,i);n.head.appendChild(o).parentNode.removeChild(o)}function w(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?n[o.call(e)]||"object":typeof e}var f="3.6.0",S=function(e,t){return new S.fn.init(e,t)};function p(e){var t=!!e&&"length"in e&&e.length,n=w(e);return!m(e)&&!x(e)&&("array"===n||0===t||"number"==typeof t&&0+~]|"+M+")"+M+"*"),U=new RegExp(M+"|>"),X=new RegExp(F),V=new RegExp("^"+I+"$"),G={ID:new RegExp("^#("+I+")"),CLASS:new RegExp("^\\.("+I+")"),TAG:new RegExp("^("+I+"|[*])"),ATTR:new RegExp("^"+W),PSEUDO:new RegExp("^"+F),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+M+"*(even|odd|(([+-]|)(\\d*)n|)"+M+"*(?:([+-]|)"+M+"*(\\d+)|))"+M+"*\\)|)","i"),bool:new RegExp("^(?:"+R+")$","i"),needsContext:new RegExp("^"+M+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+M+"*((?:-\\d)?\\d*)"+M+"*\\)|)(?=[^-]|$)","i")},Y=/HTML$/i,Q=/^(?:input|select|textarea|button)$/i,J=/^h\d$/i,K=/^[^{]+\{\s*\[native 
\w/,Z=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,ee=/[+~]/,te=new RegExp("\\\\[\\da-fA-F]{1,6}"+M+"?|\\\\([^\\r\\n\\f])","g"),ne=function(e,t){var n="0x"+e.slice(1)-65536;return t||(n<0?String.fromCharCode(n+65536):String.fromCharCode(n>>10|55296,1023&n|56320))},re=/([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g,ie=function(e,t){return t?"\0"===e?"\ufffd":e.slice(0,-1)+"\\"+e.charCodeAt(e.length-1).toString(16)+" ":"\\"+e},oe=function(){T()},ae=be(function(e){return!0===e.disabled&&"fieldset"===e.nodeName.toLowerCase()},{dir:"parentNode",next:"legend"});try{H.apply(t=O.call(p.childNodes),p.childNodes),t[p.childNodes.length].nodeType}catch(e){H={apply:t.length?function(e,t){L.apply(e,O.call(t))}:function(e,t){var n=e.length,r=0;while(e[n++]=t[r++]);e.length=n-1}}}function se(t,e,n,r){var i,o,a,s,u,l,c,f=e&&e.ownerDocument,p=e?e.nodeType:9;if(n=n||[],"string"!=typeof t||!t||1!==p&&9!==p&&11!==p)return n;if(!r&&(T(e),e=e||C,E)){if(11!==p&&(u=Z.exec(t)))if(i=u[1]){if(9===p){if(!(a=e.getElementById(i)))return n;if(a.id===i)return n.push(a),n}else if(f&&(a=f.getElementById(i))&&y(e,a)&&a.id===i)return n.push(a),n}else{if(u[2])return H.apply(n,e.getElementsByTagName(t)),n;if((i=u[3])&&d.getElementsByClassName&&e.getElementsByClassName)return H.apply(n,e.getElementsByClassName(i)),n}if(d.qsa&&!N[t+" "]&&(!v||!v.test(t))&&(1!==p||"object"!==e.nodeName.toLowerCase())){if(c=t,f=e,1===p&&(U.test(t)||z.test(t))){(f=ee.test(t)&&ye(e.parentNode)||e)===e&&d.scope||((s=e.getAttribute("id"))?s=s.replace(re,ie):e.setAttribute("id",s=S)),o=(l=h(t)).length;while(o--)l[o]=(s?"#"+s:":scope")+" "+xe(l[o]);c=l.join(",")}try{return H.apply(n,f.querySelectorAll(c)),n}catch(e){N(t,!0)}finally{s===S&&e.removeAttribute("id")}}}return g(t.replace($,"$1"),e,n,r)}function ue(){var r=[];return function e(t,n){return r.push(t+" ")>b.cacheLength&&delete e[r.shift()],e[t+" "]=n}}function le(e){return e[S]=!0,e}function ce(e){var t=C.createElement("fieldset");try{return!!e(t)}catch(e){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function fe(e,t){var n=e.split("|"),r=n.length;while(r--)b.attrHandle[n[r]]=t}function pe(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&e.sourceIndex-t.sourceIndex;if(r)return r;if(n)while(n=n.nextSibling)if(n===t)return-1;return e?1:-1}function de(t){return function(e){return"input"===e.nodeName.toLowerCase()&&e.type===t}}function he(n){return function(e){var t=e.nodeName.toLowerCase();return("input"===t||"button"===t)&&e.type===n}}function ge(t){return function(e){return"form"in e?e.parentNode&&!1===e.disabled?"label"in e?"label"in e.parentNode?e.parentNode.disabled===t:e.disabled===t:e.isDisabled===t||e.isDisabled!==!t&&ae(e)===t:e.disabled===t:"label"in e&&e.disabled===t}}function ve(a){return le(function(o){return o=+o,le(function(e,t){var n,r=a([],e.length,o),i=r.length;while(i--)e[n=r[i]]&&(e[n]=!(t[n]=e[n]))})})}function ye(e){return e&&"undefined"!=typeof e.getElementsByTagName&&e}for(e in d=se.support={},i=se.isXML=function(e){var t=e&&e.namespaceURI,n=e&&(e.ownerDocument||e).documentElement;return!Y.test(t||n&&n.nodeName||"HTML")},T=se.setDocument=function(e){var t,n,r=e?e.ownerDocument||e:p;return r!=C&&9===r.nodeType&&r.documentElement&&(a=(C=r).documentElement,E=!i(C),p!=C&&(n=C.defaultView)&&n.top!==n&&(n.addEventListener?n.addEventListener("unload",oe,!1):n.attachEvent&&n.attachEvent("onunload",oe)),d.scope=ce(function(e){return a.appendChild(e).appendChild(C.createElement("div")),"undefined"!=typeof e.querySelectorAll&&!e.querySelectorAll(":scope fieldset 
div").length}),d.attributes=ce(function(e){return e.className="i",!e.getAttribute("className")}),d.getElementsByTagName=ce(function(e){return e.appendChild(C.createComment("")),!e.getElementsByTagName("*").length}),d.getElementsByClassName=K.test(C.getElementsByClassName),d.getById=ce(function(e){return a.appendChild(e).id=S,!C.getElementsByName||!C.getElementsByName(S).length}),d.getById?(b.filter.ID=function(e){var t=e.replace(te,ne);return function(e){return e.getAttribute("id")===t}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n=t.getElementById(e);return n?[n]:[]}}):(b.filter.ID=function(e){var n=e.replace(te,ne);return function(e){var t="undefined"!=typeof e.getAttributeNode&&e.getAttributeNode("id");return t&&t.value===n}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n,r,i,o=t.getElementById(e);if(o){if((n=o.getAttributeNode("id"))&&n.value===e)return[o];i=t.getElementsByName(e),r=0;while(o=i[r++])if((n=o.getAttributeNode("id"))&&n.value===e)return[o]}return[]}}),b.find.TAG=d.getElementsByTagName?function(e,t){return"undefined"!=typeof t.getElementsByTagName?t.getElementsByTagName(e):d.qsa?t.querySelectorAll(e):void 0}:function(e,t){var n,r=[],i=0,o=t.getElementsByTagName(e);if("*"===e){while(n=o[i++])1===n.nodeType&&r.push(n);return r}return o},b.find.CLASS=d.getElementsByClassName&&function(e,t){if("undefined"!=typeof t.getElementsByClassName&&E)return t.getElementsByClassName(e)},s=[],v=[],(d.qsa=K.test(C.querySelectorAll))&&(ce(function(e){var t;a.appendChild(e).innerHTML="",e.querySelectorAll("[msallowcapture^='']").length&&v.push("[*^$]="+M+"*(?:''|\"\")"),e.querySelectorAll("[selected]").length||v.push("\\["+M+"*(?:value|"+R+")"),e.querySelectorAll("[id~="+S+"-]").length||v.push("~="),(t=C.createElement("input")).setAttribute("name",""),e.appendChild(t),e.querySelectorAll("[name='']").length||v.push("\\["+M+"*name"+M+"*="+M+"*(?:''|\"\")"),e.querySelectorAll(":checked").length||v.push(":checked"),e.querySelectorAll("a#"+S+"+*").length||v.push(".#.+[+~]"),e.querySelectorAll("\\\f"),v.push("[\\r\\n\\f]")}),ce(function(e){e.innerHTML="";var t=C.createElement("input");t.setAttribute("type","hidden"),e.appendChild(t).setAttribute("name","D"),e.querySelectorAll("[name=d]").length&&v.push("name"+M+"*[*^$|!~]?="),2!==e.querySelectorAll(":enabled").length&&v.push(":enabled",":disabled"),a.appendChild(e).disabled=!0,2!==e.querySelectorAll(":disabled").length&&v.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),v.push(",.*:")})),(d.matchesSelector=K.test(c=a.matches||a.webkitMatchesSelector||a.mozMatchesSelector||a.oMatchesSelector||a.msMatchesSelector))&&ce(function(e){d.disconnectedMatch=c.call(e,"*"),c.call(e,"[s!='']:x"),s.push("!=",F)}),v=v.length&&new RegExp(v.join("|")),s=s.length&&new RegExp(s.join("|")),t=K.test(a.compareDocumentPosition),y=t||K.test(a.contains)?function(e,t){var n=9===e.nodeType?e.documentElement:e,r=t&&t.parentNode;return e===r||!(!r||1!==r.nodeType||!(n.contains?n.contains(r):e.compareDocumentPosition&&16&e.compareDocumentPosition(r)))}:function(e,t){if(t)while(t=t.parentNode)if(t===e)return!0;return!1},j=t?function(e,t){if(e===t)return l=!0,0;var n=!e.compareDocumentPosition-!t.compareDocumentPosition;return n||(1&(n=(e.ownerDocument||e)==(t.ownerDocument||t)?e.compareDocumentPosition(t):1)||!d.sortDetached&&t.compareDocumentPosition(e)===n?e==C||e.ownerDocument==p&&y(p,e)?-1:t==C||t.ownerDocument==p&&y(p,t)?1:u?P(u,e)-P(u,t):0:4&n?-1:1)}:function(e,t){if(e===t)return l=!0,0;var 
n,r=0,i=e.parentNode,o=t.parentNode,a=[e],s=[t];if(!i||!o)return e==C?-1:t==C?1:i?-1:o?1:u?P(u,e)-P(u,t):0;if(i===o)return pe(e,t);n=e;while(n=n.parentNode)a.unshift(n);n=t;while(n=n.parentNode)s.unshift(n);while(a[r]===s[r])r++;return r?pe(a[r],s[r]):a[r]==p?-1:s[r]==p?1:0}),C},se.matches=function(e,t){return se(e,null,null,t)},se.matchesSelector=function(e,t){if(T(e),d.matchesSelector&&E&&!N[t+" "]&&(!s||!s.test(t))&&(!v||!v.test(t)))try{var n=c.call(e,t);if(n||d.disconnectedMatch||e.document&&11!==e.document.nodeType)return n}catch(e){N(t,!0)}return 0":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(te,ne),e[3]=(e[3]||e[4]||e[5]||"").replace(te,ne),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&X.test(n)&&(t=h(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(te,ne).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=m[e+" "];return t||(t=new RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&m(e,function(e){return t.test("string"==typeof e.className&&e.className||"undefined"!=typeof e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(n,r,i){return function(e){var t=se.attr(e,n);return null==t?"!="===r:!r||(t+="","="===r?t===i:"!="===r?t!==i:"^="===r?i&&0===t.indexOf(i):"*="===r?i&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return m(n)?S.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?S.grep(e,function(e){return e===n!==r}):"string"!=typeof n?S.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(S.fn.init=function(e,t,n){var r,i;if(!e)return this;if(n=n||D,"string"==typeof e){if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:q.exec(e))||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof S?t[0]:t,S.merge(this,S.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:E,!0)),N.test(r[1])&&S.isPlainObject(t))for(r in t)m(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(i=E.getElementById(r[2]))&&(this[0]=i,this.length=1),this}return e.nodeType?(this[0]=e,this.length=1,this):m(e)?void 0!==n.ready?n.ready(e):e(S):S.makeArray(e,this)}).prototype=S.fn,D=S(E);var L=/^(?:parents|prev(?:Until|All))/,H={children:!0,contents:!0,next:!0,prev:!0};function O(e,t){while((e=e[t])&&1!==e.nodeType);return e}S.fn.extend({has:function(e){var t=S(e,this),n=t.length;return this.filter(function(){for(var e=0;e\x20\t\r\n\f]*)/i,he=/^$|^module$|\/(?:java|ecma)script/i;ce=E.createDocumentFragment().appendChild(E.createElement("div")),(fe=E.createElement("input")).setAttribute("type","radio"),fe.setAttribute("checked","checked"),fe.setAttribute("name","t"),ce.appendChild(fe),y.checkClone=ce.cloneNode(!0).cloneNode(!0).lastChild.checked,ce.innerHTML="",y.noCloneChecked=!!ce.cloneNode(!0).lastChild.defaultValue,ce.innerHTML="",y.option=!!ce.lastChild;var ge={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function ve(e,t){var n;return n="undefined"!=typeof e.getElementsByTagName?e.getElementsByTagName(t||"*"):"undefined"!=typeof e.querySelectorAll?e.querySelectorAll(t||"*"):[],void 0===t||t&&A(e,t)?S.merge([e],n):n}function ye(e,t){for(var n=0,r=e.length;n",""]);var me=/<|&#?\w+;/;function xe(e,t,n,r,i){for(var o,a,s,u,l,c,f=t.createDocumentFragment(),p=[],d=0,h=e.length;d\s*$/g;function je(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&S(e).children("tbody")[0]||e}function De(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function qe(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Le(e,t){var n,r,i,o,a,s;if(1===t.nodeType){if(Y.hasData(e)&&(s=Y.get(e).events))for(i in Y.remove(t,"handle events"),s)for(n=0,r=s[i].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",i=function(e){r.remove(),i=null,e&&t("error"===e.type?404:200,e.type)}),E.head.appendChild(r[0])},abort:function(){i&&i()}}});var _t,zt=[],Ut=/(=)\?(?=&|$)|\?\?/;S.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=zt.pop()||S.expando+"_"+wt.guid++;return this[e]=!0,e}}),S.ajaxPrefilter("json jsonp",function(e,t,n){var r,i,o,a=!1!==e.jsonp&&(Ut.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Ut.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=m(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Ut,"$1"+r):!1!==e.jsonp&&(e.url+=(Tt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return o||S.error(r+" was not called"),o[0]},e.dataTypes[0]="json",i=C[r],C[r]=function(){o=arguments},n.always(function(){void 0===i?S(C).removeProp(r):C[r]=i,e[r]&&(e.jsonpCallback=t.jsonpCallback,zt.push(r)),o&&m(i)&&i(o[0]),o=i=void 0}),"script"}),y.createHTMLDocument=((_t=E.implementation.createHTMLDocument("").body).innerHTML="
",2===_t.childNodes.length),S.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=E.implementation.createHTMLDocument("")).createElement("base")).href=E.location.href,t.head.appendChild(r)):t=E),o=!n&&[],(i=N.exec(e))?[t.createElement(i[1])]:(i=xe([e],t,o),o&&o.length&&S(o).remove(),S.merge([],i.childNodes)));var r,i,o},S.fn.load=function(e,t,n){var r,i,o,a=this,s=e.indexOf(" ");return-1").append(S.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,o||[e.responseText,t,e])})}),this},S.expr.pseudos.animated=function(t){return S.grep(S.timers,function(e){return t===e.elem}).length},S.offset={setOffset:function(e,t,n){var r,i,o,a,s,u,l=S.css(e,"position"),c=S(e),f={};"static"===l&&(e.style.position="relative"),s=c.offset(),o=S.css(e,"top"),u=S.css(e,"left"),("absolute"===l||"fixed"===l)&&-1<(o+u).indexOf("auto")?(a=(r=c.position()).top,i=r.left):(a=parseFloat(o)||0,i=parseFloat(u)||0),m(t)&&(t=t.call(e,n,S.extend({},s))),null!=t.top&&(f.top=t.top-s.top+a),null!=t.left&&(f.left=t.left-s.left+i),"using"in t?t.using.call(e,f):c.css(f)}},S.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){S.offset.setOffset(this,t,e)});var e,n,r=this[0];return r?r.getClientRects().length?(e=r.getBoundingClientRect(),n=r.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var e,t,n,r=this[0],i={top:0,left:0};if("fixed"===S.css(r,"position"))t=r.getBoundingClientRect();else{t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;while(e&&(e===n.body||e===n.documentElement)&&"static"===S.css(e,"position"))e=e.parentNode;e&&e!==r&&1===e.nodeType&&((i=S(e).offset()).top+=S.css(e,"borderTopWidth",!0),i.left+=S.css(e,"borderLeftWidth",!0))}return{top:t.top-i.top-S.css(r,"marginTop",!0),left:t.left-i.left-S.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent;while(e&&"static"===S.css(e,"position"))e=e.offsetParent;return e||re})}}),S.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,i){var o="pageYOffset"===i;S.fn[t]=function(e){return $(this,function(e,t,n){var r;if(x(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n)return r?r[i]:e[t];r?r.scrollTo(o?r.pageXOffset:n,o?n:r.pageYOffset):e[t]=n},t,e,arguments.length)}}),S.each(["top","left"],function(e,n){S.cssHooks[n]=Fe(y.pixelPosition,function(e,t){if(t)return t=We(e,n),Pe.test(t)?S(e).position()[n]+"px":t})}),S.each({Height:"height",Width:"width"},function(a,s){S.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,o){S.fn[o]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),i=r||(!0===e||!0===t?"margin":"border");return $(this,function(e,t,n){var r;return x(e)?0===o.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?S.css(e,t,i):S.style(e,t,n,i)},s,n?e:void 0,n)}})}),S.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){S.fn[t]=function(e){return this.on(t,e)}}),S.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return 
this.mouseenter(e).mouseleave(t||e)}}),S.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){S.fn[n]=function(e,t){return 0",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=y.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=y.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),y.elements=c+" "+a,j(b)}function f(a){var b=x[a[v]];return b||(b={},w++,a[v]=w,x[w]=b),b}function g(a,c,d){if(c||(c=b),q)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():u.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||t.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),q)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return y.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(y,b.frag)}function j(a){a||(a=b);var d=f(a);return!y.shivCSS||p||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),q||i(a,d),a}function k(a){for(var b,c=a.getElementsByTagName("*"),e=c.length,f=RegExp("^(?:"+d().join("|")+")$","i"),g=[];e--;)b=c[e],f.test(b.nodeName)&&g.push(b.applyElement(l(b)));return g}function l(a){for(var b,c=a.attributes,d=c.length,e=a.ownerDocument.createElement(A+":"+a.nodeName);d--;)b=c[d],b.specified&&e.setAttribute(b.nodeName,b.nodeValue);return e.style.cssText=a.style.cssText,e}function m(a){for(var b,c=a.split("{"),e=c.length,f=RegExp("(^|[\\s,>+~])("+d().join("|")+")(?=[[\\s,>+~#.:]|$)","gi"),g="$1"+A+"\\:$2";e--;)b=c[e]=c[e].split("}"),b[b.length-1]=b[b.length-1].replace(f,g),c[e]=b.join("}");return c.join("{")}function n(a){for(var b=a.length;b--;)a[b].removeNode()}function o(a){function b(){clearTimeout(g._removeSheetTimer),d&&d.removeNode(!0),d=null}var d,e,g=f(a),h=a.namespaces,i=a.parentWindow;return!B||a.printShived?a:("undefined"==typeof h[A]&&h.add(A),i.attachEvent("onbeforeprint",function(){b();for(var f,g,h,i=a.styleSheets,j=[],l=i.length,n=Array(l);l--;)n[l]=i[l];for(;h=n.pop();)if(!h.disabled&&z.test(h.media)){try{f=h.imports,g=f.length}catch(o){g=0}for(l=0;g>l;l++)n.push(f[l]);try{j.push(h.cssText)}catch(o){}}j=m(j.reverse().join("")),e=k(a),d=c(a,j)}),i.attachEvent("onafterprint",function(){n(e),clearTimeout(g._removeSheetTimer),g._removeSheetTimer=setTimeout(b,500)}),a.printShived=!0,a)}var p,q,r="3.7.3",s=a.html5||{},t=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,u=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,v="_html5shiv",w=0,x={};!function(){try{var a=b.createElement("a");a.innerHTML="",p="hidden"in a,q=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof 
a.createElement}()}catch(c){p=!0,q=!0}}();var y={elements:s.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:r,shivCSS:s.shivCSS!==!1,supportsUnknownElements:q,shivMethods:s.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=y,j(b);var z=/^$|\b(?:all|print)\b/,A="html5shiv",B=!q&&function(){var c=b.documentElement;return!("undefined"==typeof b.namespaces||"undefined"==typeof b.parentWindow||"undefined"==typeof c.applyElement||"undefined"==typeof c.removeNode||"undefined"==typeof a.attachEvent)}();y.type+=" print",y.shivPrint=o,o(b),"object"==typeof module&&module.exports&&(module.exports=y)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/test_no_change/_static/js/html5shiv.min.js b/branch/test_no_change/_static/js/html5shiv.min.js new file mode 100644 index 0000000..cd1c674 --- /dev/null +++ b/branch/test_no_change/_static/js/html5shiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3 | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=t.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=t.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),t.elements=c+" "+a,j(b)}function f(a){var b=s[a[q]];return b||(b={},r++,a[q]=r,s[r]=b),b}function g(a,c,d){if(c||(c=b),l)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():p.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||o.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),l)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return t.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(t,b.frag)}function j(a){a||(a=b);var d=f(a);return!t.shivCSS||k||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),l||i(a,d),a}var k,l,m="3.7.3-pre",n=a.html5||{},o=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,p=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,q="_html5shiv",r=0,s={};!function(){try{var a=b.createElement("a");a.innerHTML="",k="hidden"in a,l=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){k=!0,l=!0}}();var t={elements:n.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time 
video",version:m,shivCSS:n.shivCSS!==!1,supportsUnknownElements:l,shivMethods:n.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=t,j(b),"object"==typeof module&&module.exports&&(module.exports=t)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/test_no_change/_static/js/theme.js b/branch/test_no_change/_static/js/theme.js new file mode 100644 index 0000000..1fddb6e --- /dev/null +++ b/branch/test_no_change/_static/js/theme.js @@ -0,0 +1 @@ +!function(n){var e={};function t(i){if(e[i])return e[i].exports;var o=e[i]={i:i,l:!1,exports:{}};return n[i].call(o.exports,o,o.exports,t),o.l=!0,o.exports}t.m=n,t.c=e,t.d=function(n,e,i){t.o(n,e)||Object.defineProperty(n,e,{enumerable:!0,get:i})},t.r=function(n){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(n,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(n,"__esModule",{value:!0})},t.t=function(n,e){if(1&e&&(n=t(n)),8&e)return n;if(4&e&&"object"==typeof n&&n&&n.__esModule)return n;var i=Object.create(null);if(t.r(i),Object.defineProperty(i,"default",{enumerable:!0,value:n}),2&e&&"string"!=typeof n)for(var o in n)t.d(i,o,function(e){return n[e]}.bind(null,o));return i},t.n=function(n){var e=n&&n.__esModule?function(){return n.default}:function(){return n};return t.d(e,"a",e),e},t.o=function(n,e){return Object.prototype.hasOwnProperty.call(n,e)},t.p="",t(t.s=0)}([function(n,e,t){t(1),n.exports=t(3)},function(n,e,t){(function(){var e="undefined"!=typeof window?window.jQuery:t(2);n.exports.ThemeNav={navBar:null,win:null,winScroll:!1,winResize:!1,linkScroll:!1,winPosition:0,winHeight:null,docHeight:null,isRunning:!1,enable:function(n){var t=this;void 0===n&&(n=!0),t.isRunning||(t.isRunning=!0,e((function(e){t.init(e),t.reset(),t.win.on("hashchange",t.reset),n&&t.win.on("scroll",(function(){t.linkScroll||t.winScroll||(t.winScroll=!0,requestAnimationFrame((function(){t.onScroll()})))})),t.win.on("resize",(function(){t.winResize||(t.winResize=!0,requestAnimationFrame((function(){t.onResize()})))})),t.onResize()})))},enableSticky:function(){this.enable(!0)},init:function(n){n(document);var e=this;this.navBar=n("div.wy-side-scroll:first"),this.win=n(window),n(document).on("click","[data-toggle='wy-nav-top']",(function(){n("[data-toggle='wy-nav-shift']").toggleClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift")})).on("click",".wy-menu-vertical .current ul li a",(function(){var t=n(this);n("[data-toggle='wy-nav-shift']").removeClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift"),e.toggleCurrent(t),e.hashChange()})).on("click","[data-toggle='rst-current-version']",(function(){n("[data-toggle='rst-versions']").toggleClass("shift-up")})),n("table.docutils:not(.field-list,.footnote,.citation)").wrap("
"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t0 + var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 + var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 + var s_v = "^(" + C + ")?" 
+ v; // vowel in stem + + this.stemWord = function (w) { + var stem; + var suffix; + var firstch; + var origword = w; + + if (w.length < 3) + return w; + + var re; + var re2; + var re3; + var re4; + + firstch = w.substr(0,1); + if (firstch == "y") + w = firstch.toUpperCase() + w.substr(1); + + // Step 1a + re = /^(.+?)(ss|i)es$/; + re2 = /^(.+?)([^s])s$/; + + if (re.test(w)) + w = w.replace(re,"$1$2"); + else if (re2.test(w)) + w = w.replace(re2,"$1$2"); + + // Step 1b + re = /^(.+?)eed$/; + re2 = /^(.+?)(ed|ing)$/; + if (re.test(w)) { + var fp = re.exec(w); + re = new RegExp(mgr0); + if (re.test(fp[1])) { + re = /.$/; + w = w.replace(re,""); + } + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = new RegExp(s_v); + if (re2.test(stem)) { + w = stem; + re2 = /(at|bl|iz)$/; + re3 = new RegExp("([^aeiouylsz])\\1$"); + re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re2.test(w)) + w = w + "e"; + else if (re3.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + else if (re4.test(w)) + w = w + "e"; + } + } + + // Step 1c + re = /^(.+?)y$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(s_v); + if (re.test(stem)) + w = stem + "i"; + } + + // Step 2 + re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step2list[suffix]; + } + + // Step 3 + re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step3list[suffix]; + } + + // Step 4 + re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + re2 = /^(.+?)(s|t)(ion)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + if (re.test(stem)) + w = stem; + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = new RegExp(mgr1); + if (re2.test(stem)) + w = stem; + } + + // Step 5 + re = /^(.+?)e$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + re2 = new RegExp(meq1); + re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) + w = stem; + } + re = /ll$/; + re2 = new RegExp(mgr1); + if (re.test(w) && re2.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + + // and turn initial Y back to y + if (firstch == "y") + w = firstch.toLowerCase() + w.substr(1); + return w; + } +} + diff --git a/branch/test_no_change/_static/minus.png b/branch/test_no_change/_static/minus.png new file mode 100644 index 0000000..d96755f Binary files /dev/null and b/branch/test_no_change/_static/minus.png differ diff --git a/branch/test_no_change/_static/plus.png b/branch/test_no_change/_static/plus.png new file mode 100644 index 0000000..7107cec Binary files /dev/null and b/branch/test_no_change/_static/plus.png differ diff --git a/branch/test_no_change/_static/pygments.css b/branch/test_no_change/_static/pygments.css new file mode 100644 index 0000000..84ab303 --- /dev/null +++ b/branch/test_no_change/_static/pygments.css @@ -0,0 +1,75 @@ +pre { line-height: 125%; } +td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 
5px; } +td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +.highlight .hll { background-color: #ffffcc } +.highlight { background: #f8f8f8; } +.highlight .c { color: #3D7B7B; font-style: italic } /* Comment */ +.highlight .err { border: 1px solid #FF0000 } /* Error */ +.highlight .k { color: #008000; font-weight: bold } /* Keyword */ +.highlight .o { color: #666666 } /* Operator */ +.highlight .ch { color: #3D7B7B; font-style: italic } /* Comment.Hashbang */ +.highlight .cm { color: #3D7B7B; font-style: italic } /* Comment.Multiline */ +.highlight .cp { color: #9C6500 } /* Comment.Preproc */ +.highlight .cpf { color: #3D7B7B; font-style: italic } /* Comment.PreprocFile */ +.highlight .c1 { color: #3D7B7B; font-style: italic } /* Comment.Single */ +.highlight .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */ +.highlight .gd { color: #A00000 } /* Generic.Deleted */ +.highlight .ge { font-style: italic } /* Generic.Emph */ +.highlight .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */ +.highlight .gr { color: #E40000 } /* Generic.Error */ +.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ +.highlight .gi { color: #008400 } /* Generic.Inserted */ +.highlight .go { color: #717171 } /* Generic.Output */ +.highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */ +.highlight .gs { font-weight: bold } /* Generic.Strong */ +.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ +.highlight .gt { color: #0044DD } /* Generic.Traceback */ +.highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */ +.highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */ +.highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */ +.highlight .kp { color: #008000 } /* Keyword.Pseudo */ +.highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */ +.highlight .kt { color: #B00040 } /* Keyword.Type */ +.highlight .m { color: #666666 } /* Literal.Number */ +.highlight .s { color: #BA2121 } /* Literal.String */ +.highlight .na { color: #687822 } /* Name.Attribute */ +.highlight .nb { color: #008000 } /* Name.Builtin */ +.highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */ +.highlight .no { color: #880000 } /* Name.Constant */ +.highlight .nd { color: #AA22FF } /* Name.Decorator */ +.highlight .ni { color: #717171; font-weight: bold } /* Name.Entity */ +.highlight .ne { color: #CB3F38; font-weight: bold } /* Name.Exception */ +.highlight .nf { color: #0000FF } /* Name.Function */ +.highlight .nl { color: #767600 } /* Name.Label */ +.highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */ +.highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */ +.highlight .nv { color: #19177C } /* Name.Variable */ +.highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */ +.highlight .w { color: #bbbbbb } /* Text.Whitespace */ +.highlight .mb { color: #666666 } /* Literal.Number.Bin */ +.highlight .mf { color: #666666 } /* Literal.Number.Float */ +.highlight .mh { color: #666666 } /* Literal.Number.Hex */ +.highlight .mi { color: #666666 } /* Literal.Number.Integer */ +.highlight .mo { color: #666666 } /* Literal.Number.Oct */ +.highlight .sa { color: #BA2121 } /* Literal.String.Affix */ +.highlight .sb { color: #BA2121 } /* Literal.String.Backtick */ 
+.highlight .sc { color: #BA2121 } /* Literal.String.Char */ +.highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */ +.highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */ +.highlight .s2 { color: #BA2121 } /* Literal.String.Double */ +.highlight .se { color: #AA5D1F; font-weight: bold } /* Literal.String.Escape */ +.highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */ +.highlight .si { color: #A45A77; font-weight: bold } /* Literal.String.Interpol */ +.highlight .sx { color: #008000 } /* Literal.String.Other */ +.highlight .sr { color: #A45A77 } /* Literal.String.Regex */ +.highlight .s1 { color: #BA2121 } /* Literal.String.Single */ +.highlight .ss { color: #19177C } /* Literal.String.Symbol */ +.highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */ +.highlight .fm { color: #0000FF } /* Name.Function.Magic */ +.highlight .vc { color: #19177C } /* Name.Variable.Class */ +.highlight .vg { color: #19177C } /* Name.Variable.Global */ +.highlight .vi { color: #19177C } /* Name.Variable.Instance */ +.highlight .vm { color: #19177C } /* Name.Variable.Magic */ +.highlight .il { color: #666666 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/branch/test_no_change/_static/searchtools.js b/branch/test_no_change/_static/searchtools.js new file mode 100644 index 0000000..97d56a7 --- /dev/null +++ b/branch/test_no_change/_static/searchtools.js @@ -0,0 +1,566 @@ +/* + * searchtools.js + * ~~~~~~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for the full-text search. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +/** + * Simple result scoring code. + */ +if (typeof Scorer === "undefined") { + var Scorer = { + // Implement the following function to further tweak the score for each result + // The function takes a result array [docname, title, anchor, descr, score, filename] + // and returns the new score. + /* + score: result => { + const [docname, title, anchor, descr, score, filename] = result + return score + }, + */ + + // query matches the full name of an object + objNameMatch: 11, + // or matches in the last dotted part of the object name + objPartialMatch: 6, + // Additive scores depending on the priority of the object + objPrio: { + 0: 15, // used to be importantResults + 1: 5, // used to be objectResults + 2: -5, // used to be unimportantResults + }, + // Used when the priority is not in the mapping. 
+ objPrioDefault: 0, + + // query found in title + title: 15, + partialTitle: 7, + // query found in terms + term: 5, + partialTerm: 2, + }; +} + +const _removeChildren = (element) => { + while (element && element.lastChild) element.removeChild(element.lastChild); +}; + +/** + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping + */ +const _escapeRegExp = (string) => + string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string + +const _displayItem = (item, searchTerms) => { + const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; + const docUrlRoot = DOCUMENTATION_OPTIONS.URL_ROOT; + const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; + const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; + const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; + + const [docName, title, anchor, descr, score, _filename] = item; + + let listItem = document.createElement("li"); + let requestUrl; + let linkUrl; + if (docBuilder === "dirhtml") { + // dirhtml builder + let dirname = docName + "/"; + if (dirname.match(/\/index\/$/)) + dirname = dirname.substring(0, dirname.length - 6); + else if (dirname === "index/") dirname = ""; + requestUrl = docUrlRoot + dirname; + linkUrl = requestUrl; + } else { + // normal html builders + requestUrl = docUrlRoot + docName + docFileSuffix; + linkUrl = docName + docLinkSuffix; + } + let linkEl = listItem.appendChild(document.createElement("a")); + linkEl.href = linkUrl + anchor; + linkEl.dataset.score = score; + linkEl.innerHTML = title; + if (descr) + listItem.appendChild(document.createElement("span")).innerHTML = + " (" + descr + ")"; + else if (showSearchSummary) + fetch(requestUrl) + .then((responseData) => responseData.text()) + .then((data) => { + if (data) + listItem.appendChild( + Search.makeSearchSummary(data, searchTerms) + ); + }); + Search.output.appendChild(listItem); +}; +const _finishSearch = (resultCount) => { + Search.stopPulse(); + Search.title.innerText = _("Search Results"); + if (!resultCount) + Search.status.innerText = Documentation.gettext( + "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." + ); + else + Search.status.innerText = _( + `Search finished, found ${resultCount} page(s) matching the search query.` + ); +}; +const _displayNextItem = ( + results, + resultCount, + searchTerms +) => { + // results left, load the summary and display it + // this is intended to be dynamic (don't sub resultsCount) + if (results.length) { + _displayItem(results.pop(), searchTerms); + setTimeout( + () => _displayNextItem(results, resultCount, searchTerms), + 5 + ); + } + // search finished, update title and status message + else _finishSearch(resultCount); +}; + +/** + * Default splitQuery function. Can be overridden in ``sphinx.search`` with a + * custom function per language. + * + * The regular expression works by splitting the string on consecutive characters + * that are not Unicode letters, numbers, underscores, or emoji characters. + * This is the same as ``\W+`` in Python, preserving the surrogate pair area. 
+ */ +if (typeof splitQuery === "undefined") { + var splitQuery = (query) => query + .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) + .filter(term => term) // remove remaining empty strings +} + +/** + * Search Module + */ +const Search = { + _index: null, + _queued_query: null, + _pulse_status: -1, + + htmlToText: (htmlString) => { + const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); + htmlElement.querySelectorAll(".headerlink").forEach((el) => { el.remove() }); + const docContent = htmlElement.querySelector('[role="main"]'); + if (docContent !== undefined) return docContent.textContent; + console.warn( + "Content block not found. Sphinx search tries to obtain it via '[role=main]'. Could you check your theme or template." + ); + return ""; + }, + + init: () => { + const query = new URLSearchParams(window.location.search).get("q"); + document + .querySelectorAll('input[name="q"]') + .forEach((el) => (el.value = query)); + if (query) Search.performSearch(query); + }, + + loadIndex: (url) => + (document.body.appendChild(document.createElement("script")).src = url), + + setIndex: (index) => { + Search._index = index; + if (Search._queued_query !== null) { + const query = Search._queued_query; + Search._queued_query = null; + Search.query(query); + } + }, + + hasIndex: () => Search._index !== null, + + deferQuery: (query) => (Search._queued_query = query), + + stopPulse: () => (Search._pulse_status = -1), + + startPulse: () => { + if (Search._pulse_status >= 0) return; + + const pulse = () => { + Search._pulse_status = (Search._pulse_status + 1) % 4; + Search.dots.innerText = ".".repeat(Search._pulse_status); + if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); + }; + pulse(); + }, + + /** + * perform a search for something (or wait until index is loaded) + */ + performSearch: (query) => { + // create the required interface elements + const searchText = document.createElement("h2"); + searchText.textContent = _("Searching"); + const searchSummary = document.createElement("p"); + searchSummary.classList.add("search-summary"); + searchSummary.innerText = ""; + const searchList = document.createElement("ul"); + searchList.classList.add("search"); + + const out = document.getElementById("search-results"); + Search.title = out.appendChild(searchText); + Search.dots = Search.title.appendChild(document.createElement("span")); + Search.status = out.appendChild(searchSummary); + Search.output = out.appendChild(searchList); + + const searchProgress = document.getElementById("search-progress"); + // Some themes don't use the search progress node + if (searchProgress) { + searchProgress.innerText = _("Preparing search..."); + } + Search.startPulse(); + + // index already loaded, the browser was quick! 
+ if (Search.hasIndex()) Search.query(query); + else Search.deferQuery(query); + }, + + /** + * execute search (requires search index to be loaded) + */ + query: (query) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + const allTitles = Search._index.alltitles; + const indexEntries = Search._index.indexentries; + + // stem the search terms and add them to the correct list + const stemmer = new Stemmer(); + const searchTerms = new Set(); + const excludedTerms = new Set(); + const highlightTerms = new Set(); + const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); + splitQuery(query.trim()).forEach((queryTerm) => { + const queryTermLower = queryTerm.toLowerCase(); + + // maybe skip this "word" + // stopwords array is from language_data.js + if ( + stopwords.indexOf(queryTermLower) !== -1 || + queryTerm.match(/^\d+$/) + ) + return; + + // stem the word + let word = stemmer.stemWord(queryTermLower); + // select the correct list + if (word[0] === "-") excludedTerms.add(word.substr(1)); + else { + searchTerms.add(word); + highlightTerms.add(queryTermLower); + } + }); + + if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js + localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) + } + + // console.debug("SEARCH: searching for:"); + // console.info("required: ", [...searchTerms]); + // console.info("excluded: ", [...excludedTerms]); + + // array of [docname, title, anchor, descr, score, filename] + let results = []; + _removeChildren(document.getElementById("search-progress")); + + const queryLower = query.toLowerCase(); + for (const [title, foundTitles] of Object.entries(allTitles)) { + if (title.toLowerCase().includes(queryLower) && (queryLower.length >= title.length/2)) { + for (const [file, id] of foundTitles) { + let score = Math.round(100 * queryLower.length / title.length) + results.push([ + docNames[file], + titles[file] !== title ? `${titles[file]} > ${title}` : title, + id !== null ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // search for explicit entries in index directives + for (const [entry, foundEntries] of Object.entries(indexEntries)) { + if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { + for (const [file, id] of foundEntries) { + let score = Math.round(100 * queryLower.length / entry.length) + results.push([ + docNames[file], + titles[file], + id ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // lookup as object + objectTerms.forEach((term) => + results.push(...Search.performObjectSearch(term, objectTerms)) + ); + + // lookup as search terms in fulltext + results.push(...Search.performTermsSearch(searchTerms, excludedTerms)); + + // let the scorer override scores with a custom scoring function + if (Scorer.score) results.forEach((item) => (item[4] = Scorer.score(item))); + + // now sort the results by score (in opposite order of appearance, since the + // display function below uses pop() to retrieve items) and then + // alphabetically + results.sort((a, b) => { + const leftScore = a[4]; + const rightScore = b[4]; + if (leftScore === rightScore) { + // same score: sort alphabetically + const leftTitle = a[1].toLowerCase(); + const rightTitle = b[1].toLowerCase(); + if (leftTitle === rightTitle) return 0; + return leftTitle > rightTitle ? -1 : 1; // inverted is intentional + } + return leftScore > rightScore ? 
1 : -1; + }); + + // remove duplicate search results + // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept + let seen = new Set(); + results = results.reverse().reduce((acc, result) => { + let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); + if (!seen.has(resultStr)) { + acc.push(result); + seen.add(resultStr); + } + return acc; + }, []); + + results = results.reverse(); + + // for debugging + //Search.lastresults = results.slice(); // a copy + // console.info("search results:", Search.lastresults); + + // print the results + _displayNextItem(results, results.length, searchTerms); + }, + + /** + * search for object names + */ + performObjectSearch: (object, objectTerms) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const objects = Search._index.objects; + const objNames = Search._index.objnames; + const titles = Search._index.titles; + + const results = []; + + const objectSearchCallback = (prefix, match) => { + const name = match[4] + const fullname = (prefix ? prefix + "." : "") + name; + const fullnameLower = fullname.toLowerCase(); + if (fullnameLower.indexOf(object) < 0) return; + + let score = 0; + const parts = fullnameLower.split("."); + + // check for different match types: exact matches of full name or + // "last name" (i.e. last dotted part) + if (fullnameLower === object || parts.slice(-1)[0] === object) + score += Scorer.objNameMatch; + else if (parts.slice(-1)[0].indexOf(object) > -1) + score += Scorer.objPartialMatch; // matches in last name + + const objName = objNames[match[1]][2]; + const title = titles[match[0]]; + + // If more than one term searched for, we require other words to be + // found in the name/title/description + const otherTerms = new Set(objectTerms); + otherTerms.delete(object); + if (otherTerms.size > 0) { + const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); + if ( + [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) + ) + return; + } + + let anchor = match[3]; + if (anchor === "") anchor = fullname; + else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; + + const descr = objName + _(", in ") + title; + + // add custom score for some objects according to scorer + if (Scorer.objPrio.hasOwnProperty(match[2])) + score += Scorer.objPrio[match[2]]; + else score += Scorer.objPrioDefault; + + results.push([ + docNames[match[0]], + fullname, + "#" + anchor, + descr, + score, + filenames[match[0]], + ]); + }; + Object.keys(objects).forEach((prefix) => + objects[prefix].forEach((array) => + objectSearchCallback(prefix, array) + ) + ); + return results; + }, + + /** + * search for full-text terms in the index + */ + performTermsSearch: (searchTerms, excludedTerms) => { + // prepare search + const terms = Search._index.terms; + const titleTerms = Search._index.titleterms; + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + + const scoreMap = new Map(); + const fileMap = new Map(); + + // perform the search on the required terms + searchTerms.forEach((word) => { + const files = []; + const arr = [ + { files: terms[word], score: Scorer.term }, + { files: titleTerms[word], score: Scorer.title }, + ]; + // add support for partial matches + if (word.length > 2) { + const escapedWord = _escapeRegExp(word); + Object.keys(terms).forEach((term) => { + if (term.match(escapedWord) && !terms[word]) + arr.push({ 
files: terms[term], score: Scorer.partialTerm }); + }); + Object.keys(titleTerms).forEach((term) => { + if (term.match(escapedWord) && !titleTerms[word]) + arr.push({ files: titleTerms[word], score: Scorer.partialTitle }); + }); + } + + // no match but word was a required one + if (arr.every((record) => record.files === undefined)) return; + + // found search word in contents + arr.forEach((record) => { + if (record.files === undefined) return; + + let recordFiles = record.files; + if (recordFiles.length === undefined) recordFiles = [recordFiles]; + files.push(...recordFiles); + + // set score for the word in each file + recordFiles.forEach((file) => { + if (!scoreMap.has(file)) scoreMap.set(file, {}); + scoreMap.get(file)[word] = record.score; + }); + }); + + // create the mapping + files.forEach((file) => { + if (fileMap.has(file) && fileMap.get(file).indexOf(word) === -1) + fileMap.get(file).push(word); + else fileMap.set(file, [word]); + }); + }); + + // now check if the files don't contain excluded terms + const results = []; + for (const [file, wordList] of fileMap) { + // check if all requirements are matched + + // as search terms with length < 3 are discarded + const filteredTermCount = [...searchTerms].filter( + (term) => term.length > 2 + ).length; + if ( + wordList.length !== searchTerms.size && + wordList.length !== filteredTermCount + ) + continue; + + // ensure that none of the excluded terms is in the search result + if ( + [...excludedTerms].some( + (term) => + terms[term] === file || + titleTerms[term] === file || + (terms[term] || []).includes(file) || + (titleTerms[term] || []).includes(file) + ) + ) + break; + + // select one (max) score for the file. + const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); + // add result to the result list + results.push([ + docNames[file], + titles[file], + "", + null, + score, + filenames[file], + ]); + } + return results; + }, + + /** + * helper function to return a node containing the + * search summary for a given text. keywords is a list + * of stemmed words. + */ + makeSearchSummary: (htmlText, keywords) => { + const text = Search.htmlToText(htmlText); + if (text === "") return null; + + const textLower = text.toLowerCase(); + const actualStartPosition = [...keywords] + .map((k) => textLower.indexOf(k.toLowerCase())) + .filter((i) => i > -1) + .slice(-1)[0]; + const startWithContext = Math.max(actualStartPosition - 120, 0); + + const top = startWithContext === 0 ? "" : "..."; + const tail = startWithContext + 240 < text.length ? "..." : ""; + + let summary = document.createElement("p"); + summary.classList.add("context"); + summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; + + return summary; + }, +}; + +_ready(Search.init); diff --git a/branch/test_no_change/_static/sphinx_highlight.js b/branch/test_no_change/_static/sphinx_highlight.js new file mode 100644 index 0000000..aae669d --- /dev/null +++ b/branch/test_no_change/_static/sphinx_highlight.js @@ -0,0 +1,144 @@ +/* Highlighting utilities for Sphinx HTML documentation. */ +"use strict"; + +const SPHINX_HIGHLIGHT_ENABLED = true + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. 
+ */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + parent.insertBefore( + span, + parent.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const SphinxHighlight = { + + /** + * highlight the search words provided in localstorage in the text + */ + highlightSearchWords: () => { + if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight + + // get and clear terms from localstorage + const url = new URL(window.location); + const highlight = + localStorage.getItem("sphinx_highlight_terms") + || url.searchParams.get("highlight") + || ""; + localStorage.removeItem("sphinx_highlight_terms") + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + + // get individual terms from highlight string + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? 
divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + localStorage.removeItem("sphinx_highlight_terms") + }, + + initEscapeListener: () => { + // only install a listener if it is really needed + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; + if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { + SphinxHighlight.hideSearchWords(); + event.preventDefault(); + } + }); + }, +}; + +_ready(SphinxHighlight.highlightSearchWords); +_ready(SphinxHighlight.initEscapeListener); diff --git a/branch/test_no_change/autodoc/index.html b/branch/test_no_change/autodoc/index.html new file mode 100644 index 0000000..4179b9e --- /dev/null +++ b/branch/test_no_change/autodoc/index.html @@ -0,0 +1,165 @@ + + + + + + + Lasso Classes and Functions — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

+ + + + + + + + + + + + + + + + + + +

CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files (see the usage sketch after this table).

ModelRoadwayNetwork

Subclass of network_wrangler class RoadwayNetwork

Project

A single or set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks including time of day, categories, etc.

+
+
+
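To orient readers to the table above, the sketch below shows the typical-usage pattern for StandardTransit as indexed on its class page. It is illustrative only: BASE_TRANSIT_DIR, WRITE_DIR, and the output file name are placeholder values, and the exact call signatures should be confirmed on the lasso.StandardTransit page.

import os

from lasso import StandardTransit

BASE_TRANSIT_DIR = "examples/gtfs_feed"  # placeholder: directory containing a GTFS feed
WRITE_DIR = "tests/scratch"              # placeholder: output directory

# Read a GTFS feed into a StandardTransit object (held as a Partridge feed) ...
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
# ... then translate it to a Cube transit line (.lin) file.
cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "transit.lin"))

CubeTransit and Project cover the reverse direction, reading Cube files and expressing differences as Project Card dictionaries, as summarized in the table.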

Utils and Functions

+ + + + + + + + + +

util

logger

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/genindex/index.html b/branch/test_no_change/genindex/index.html new file mode 100644 index 0000000..b38d787 --- /dev/null +++ b/branch/test_no_change/genindex/index.html @@ -0,0 +1,1040 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Index

+ +
+ _ + | A + | B + | C + | D + | E + | F + | G + | H + | I + | K + | L + | M + | N + | O + | P + | R + | S + | T + | U + | V + | W + | X + | Y + | Z + +
+

_

+ + +
+ +

A

+ + + +
+ +

B

+ + + +
+ +

C

+ + + +
+ +

D

+ + + +
+ +

E

+ + + +
+ +

F

+ + + +
+ +

G

+ + + +
+ +

H

+ + + +
+ +

I

+ + + +
+ +

K

+ + +
+ +

L

+ + + +
+ +

M

+ + + +
+ +

N

+ + + +
+ +

O

+ + + +
+ +

P

+ + + +
+ +

R

+ + + +
+ +

S

+ + + +
+ +

T

+ + + +
+ +

U

+ + + +
+ +

V

+ + + +
+ +

W

+ + + +
+ +

X

+ + + +
+ +

Y

+ + +
+ +

Z

+ + + +
+ + + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/index.html b/branch/test_no_change/index.html new file mode 100644 index 0000000..c65fa50 --- /dev/null +++ b/branch/test_no_change/index.html @@ -0,0 +1,181 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the network_wrangler package for MetCouncil and MTC. It aims to provide the following functionality (a usage sketch follows the list below):

+
  1. parse Cube log files and base highway networks and create ProjectCards for Network Wrangler
  2. parse two Cube transit line files and create ProjectCards for Network Wrangler
  3. refine Network Wrangler highway networks to contain specific variables and settings for the respective agency and export them to a format that can be read in by Citilab’s Cube software.
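A minimal usage sketch of the first two items is shown below. It is illustrative only: the directory constants and file names are placeholders, and the keyword arguments are taken from the lasso.Project class page in these docs, so the exact signatures should be confirmed there.

import os

from lasso import Project

CUBE_DIR = "examples/cube"          # placeholder: directory holding Cube inputs
BASE_ROADWAY_DIR = "examples/base"  # placeholder: base standard roadway network
SCRATCH_DIR = "tests/scratch"       # placeholder: output directory

# 1. Turn a Cube .log file of highway edits into a Network Wrangler project card.
highway_project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "highway_edits.log"),  # placeholder file name
    base_roadway_dir=BASE_ROADWAY_DIR,
)
highway_project.write_project_card(os.path.join(SCRATCH_DIR, "highway_changes.yml"))

# 2. Compare two Cube transit line files and write the differences as a project card.
transit_project = Project.create_project(
    base_cube_transit_source=os.path.join(CUBE_DIR, "base_transit"),   # placeholder directory
    build_cube_transit_source=os.path.join(CUBE_DIR, "build_transit"), # placeholder directory
)
transit_project.write_project_card(os.path.join(SCRATCH_DIR, "transit_changes.yml"))

The third item, refining and exporting model networks, is covered by ModelRoadwayNetwork (for example its write_roadway_as_shp and write_roadway_as_fixedwidth methods), which is documented on its own class page.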
+
+

Indices and tables

+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/objects.inv b/branch/test_no_change/objects.inv new file mode 100644 index 0000000..57b89c9 Binary files /dev/null and b/branch/test_no_change/objects.inv differ diff --git a/branch/test_no_change/py-modindex/index.html b/branch/test_no_change/py-modindex/index.html new file mode 100644 index 0000000..46c6f94 --- /dev/null +++ b/branch/test_no_change/py-modindex/index.html @@ -0,0 +1,136 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Python Module Index

+ +
+ l +
+ + + + + + + + + + + + + +
 
+ l
+ lasso +
    + lasso.logger +
    + lasso.util +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/running/index.html b/branch/test_no_change/running/index.html new file mode 100644 index 0000000..6feca00 --- /dev/null +++ b/branch/test_no_change/running/index.html @@ -0,0 +1,133 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
+

Create a scenario

+
+
+

Exporting networks

+
+
+

Auditing and Reporting

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/search/index.html b/branch/test_no_change/search/index.html new file mode 100644 index 0000000..6e5865f --- /dev/null +++ b/branch/test_no_change/search/index.html @@ -0,0 +1,126 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + + + +
+ +
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + + + + + + \ No newline at end of file diff --git a/branch/test_no_change/searchindex.js b/branch/test_no_change/searchindex.js new file mode 100644 index 0000000..69a3e2a --- /dev/null +++ b/branch/test_no_change/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({"docnames": ["_generated/lasso.CubeTransit", "_generated/lasso.ModelRoadwayNetwork", "_generated/lasso.Parameters", "_generated/lasso.Project", "_generated/lasso.StandardTransit", "_generated/lasso.logger", "_generated/lasso.util", "autodoc", "index", "running", "setup", "starting"], "filenames": ["_generated/lasso.CubeTransit.rst", "_generated/lasso.ModelRoadwayNetwork.rst", "_generated/lasso.Parameters.rst", "_generated/lasso.Project.rst", "_generated/lasso.StandardTransit.rst", "_generated/lasso.logger.rst", "_generated/lasso.util.rst", "autodoc.rst", "index.rst", "running.md", "setup.md", "starting.md"], "titles": ["lasso.CubeTransit", "lasso.ModelRoadwayNetwork", "lasso.Parameters", "lasso.Project", "lasso.StandardTransit", "lasso.logger", "lasso.util", "Lasso Classes and Functions", "Welcome to lasso\u2019s documentation!", "Running Lasso", "Setup", "Starting Out"], "terms": {"class": [0, 1, 2, 3, 4, 6, 8, 11], "paramet": [0, 1, 3, 4, 6, 8], "sourc": [0, 1, 2, 3, 4, 5, 6, 11], "base": [0, 1, 2, 3, 4, 6, 8, 11], "object": [0, 1, 2, 3, 4, 6, 11], "store": [0, 1, 11], "inform": [0, 1, 4, 11], "about": [0, 1, 4, 11], "transit": [0, 1, 2, 3, 4, 8], "defin": [0, 1, 2, 3, 11], "cube": [0, 1, 2, 3, 4, 8], "line": [0, 1, 2, 3, 4, 6, 8, 11], "file": [0, 1, 2, 3, 4, 8], "ha": [0, 6, 11], "capabl": [0, 11], "pars": [0, 8, 11], "properti": [0, 1, 2, 4, 6, 11], "shape": [0, 1, 3, 4, 6, 11], "python": [0, 1, 2, 11], "dictionari": [0, 1, 2, 3, 4, 6, 11], "compar": [0, 3, 4, 11], "repres": [0, 2, 4, 6, 11], "chang": [0, 1, 3, 4, 11], "project": [0, 1, 2, 4, 6, 8], "card": [0, 1, 3, 4], "typic": [0, 3, 4, 6, 8], "usag": [0, 1, 3, 4], "exampl": [0, 1, 3, 4, 6, 11], "tn": [0, 11], "create_from_cub": [0, 11], "cube_dir": [0, 3, 11], "transit_change_list": [0, 11], "evaluate_differ": [0, 4, 11], "base_transit_network": [0, 3, 11], "list": [0, 1, 2, 3, 4, 6, 11], "string": [0, 1, 3, 4, 6], "uniqu": [0, 1, 3], "name": [0, 1, 2, 3, 4, 6, 11], "network": [0, 1, 2, 3, 4, 5, 8], "type": [0, 1, 2, 3, 4, 6, 11], "line_properti": 0, "kei": [0, 1, 3, 11], "valu": [0, 1, 3, 4, 6, 11], "ar": [0, 1, 2, 3, 6, 11], "These": 0, "directli": 0, "read": [0, 1, 3, 4, 8, 11], "from": [0, 1, 2, 3, 4, 6, 8], "haven": 0, "t": [0, 1, 6], "been": [0, 6], "translat": [0, 4, 11], "standard": [0, 1, 2, 4, 11], "dict": [0, 1, 2, 3, 4], "panda": [0, 1, 3], "datafram": [0, 1, 3, 4, 11], "node": [0, 1, 2, 3, 4, 6, 11], "follow": [0, 4, 8, 11], "column": [0, 1, 4], "node_id": 0, "int": [0, 1, 2, 3, 4, 6], "posit": [0, 6], "integ": [0, 1, 2], "id": [0, 1, 3, 4, 11], "number": [0, 1, 2, 3, 4, 6, 11], "neg": [0, 6], "indic": [0, 1, 6], "non": [0, 6], "stop": [0, 4], "boolean": [0, 1], "i": [0, 1, 3, 4, 5, 6, 8, 11], "order": [0, 1, 2, 6, 11], "within": [0, 1, 2, 6], "thi": [0, 1, 2, 3, 4, 6, 8, 11], "program_typ": 0, "either": [0, 1, 5, 6, 11], "pt": 0, "trnbld": 0, "str": [0, 1, 2, 3, 6], "instanc": [0, 1, 2, 3, 4, 11], "appli": [0, 1, 6], "which": [0, 1, 3, 6, 11], "includ": [0, 1, 2, 11], "time": [0, 1, 2, 4, 6, 11], "period": [0, 1, 2, 4, 11], "variabl": [0, 1, 2, 3, 4, 8, 11], "source_list": 0, "have": [0, 1, 6, 8], "ad": [0, 1, 3, 6, 11], "diff_dict": 0, "__init__": [0, 1, 2, 3, 4], "constructor": [0, 1, 3], "set": [0, 1, 2, 3, 4, 5, 6, 8, 11], "see": [0, 1, 3, 4, 6], 
"an": [0, 1, 3, 4, 6, 11], "method": [0, 1, 2, 3, 4, 6, 11], "add_additional_time_period": 0, "new_time_period_numb": 0, "orig_line_nam": 0, "copi": [0, 1, 3, 4, 6, 11], "rout": [0, 1, 4], "anoth": 0, "appropri": [0, 4], "specif": [0, 1, 8, 11], "new": [0, 1, 6, 11], "under": 0, "self": [0, 1, 2, 3, 4, 6], "origin": [0, 1, 6, 11], "its": [0, 1], "return": [0, 1, 3, 4, 6], "add_cub": 0, "transit_sourc": 0, "lin": [0, 3, 4], "add": [0, 1, 3, 6, 11], "exist": [0, 1, 3, 6, 11], "transitnetwork": [0, 4], "directori": [0, 1, 4, 11], "static": [0, 1, 3, 4], "build_route_nam": 0, "route_id": [0, 4], "time_period": [0, 1, 2], "agency_id": 0, "0": [0, 1, 2, 4, 6, 11], "direction_id": 0, "1": [0, 1, 2, 4, 6, 11], "creat": [0, 1, 2, 3, 4, 6, 8, 11], "contaten": 0, "agenc": [0, 8], "direct": [0, 1, 6, 11], "e": [0, 1, 11], "452": 0, "111": 0, "pk": [0, 2], "construct": [0, 6, 11], "line_nam": 0, "0_452": 0, "111_452_pk1": 0, "calculate_start_end_tim": 0, "line_properties_dict": 0, "calcul": [0, 1, 2, 3, 4, 6], "start": [0, 1, 4, 8], "end": [0, 1, 6], "warn": [0, 1], "doesn": [0, 1], "take": [0, 1], "care": 0, "discongru": 0, "flavor": [0, 1], "create_add_route_card_dict": 0, "format": [0, 1, 2, 4, 8, 11], "route_properti": 0, "being": 0, "updat": [0, 1, 11], "A": [0, 1, 2, 3, 4, 6, 11], "addit": [0, 1, 6, 8, 11], "create_delete_route_card_dict": 0, "base_transit_line_properties_dict": 0, "delet": [0, 1], "style": [0, 6], "attribut": [0, 1, 2, 3, 4, 11], "find": [0, 1, 4], "create_update_route_card_dict": 0, "updated_properties_dict": 0, "cube_properties_to_standard_properti": 0, "cube_properties_dict": 0, "convert": [0, 1, 4, 6], "most": 0, "pertin": 0, "like": [0, 1, 2, 6], "headwai": [0, 2, 4], "varibl": [0, 2], "stnadard": 0, "unit": [0, 1, 6], "minut": 0, "second": [0, 4, 6], "correct": 0, "base_transit": 0, "identifi": [0, 1, 11], "what": [0, 1, 3, 6], "need": [0, 1, 2, 3, 4], "For": [0, 1, 4, 6], "multipl": [0, 1, 6, 11], "make": [0, 1, 11], "duplic": 0, "so": [0, 1, 5, 6, 11], "each": [0, 1, 3, 6], "condit": [0, 1], "contain": [0, 1, 4, 6, 8, 11], "requir": [0, 1, 6, 11], "evalu": [0, 1, 3], "differ": [0, 2, 6], "between": [0, 1, 2, 3, 6], "evaluate_route_property_differ": 0, "properties_build": 0, "properties_bas": 0, "time_period_numb": 0, "absolut": [0, 6], "true": [0, 1, 3, 4, 5, 6, 11], "validate_bas": 0, "fals": [0, 1, 3, 4, 6, 11], "check": [0, 1, 3, 6], "ani": [0, 1, 3, 6, 11], "entri": [0, 3], "property_nam": 0, "property_valu": 0, "us": [0, 1, 2, 3, 4, 6, 11], "command": [0, 1, 3], "rather": [0, 1], "than": [0, 1, 6], "If": [0, 1, 3, 4, 6, 11], "automat": 0, "note": [0, 6, 11], "onli": [0, 1, 3, 4, 6], "numer": [0, 4, 6], "frequenc": [0, 4, 11], "suitabl": 0, "write": [0, 1, 3, 4, 11], "evaluate_route_shape_chang": 0, "shape_build": 0, "shape_bas": 0, "two": [0, 1, 6, 8, 11], "build": [0, 1, 3, 11], "version": [0, 6, 11], "ddatafram": 0, "get_time_period_numbers_from_cube_properti": 0, "properties_list": 0, "associ": [0, 2], "them": [0, 1, 4, 6, 8], "all": [0, 1, 2, 5, 6, 11], "found": [0, 1, 6], "unpack_route_nam": 0, "unpack": 0, "info": [0, 1, 2], "link": [1, 2, 3, 6, 11], "kwarg": [1, 2, 3, 6], "roadwaynetwork": [1, 3, 4], "subclass": [1, 11], "network_wrangl": [1, 8, 11], "represent": [1, 3, 4, 6], "physic": 1, "roadwai": [1, 2, 3, 4, 11], "geodatafram": [1, 11], "specifi": [1, 3, 6], "default": [1, 2, 3, 4, 6, 11], "cr": [1, 3, 6], "coordin": [1, 3, 6], "refer": [1, 3, 4, 6, 11], "system": [1, 3], "espg": [1, 3], "node_foreign_kei": [1, 3], "tabl": [1, 2, 3, 6], 
"link_foreign_kei": [1, 3], "foreign": [1, 3], "shape_foreign_kei": [1, 3, 11], "unique_link_id": [1, 3], "unique_node_id": [1, 3], "modes_to_network_link_vari": [1, 3], "map": [1, 2, 3, 6, 11], "mode": [1, 3, 4, 11], "modes_to_network_nodes_vari": [1, 3], "managed_lanes_node_id_scalar": [1, 3], "scalar": [1, 3, 6], "primari": [1, 3], "correspond": [1, 3, 4], "manag": [1, 2, 3, 11], "lane": [1, 2, 3, 11], "managed_lanes_link_id_scalar": [1, 3], "managed_lanes_required_attribut": [1, 3], "must": [1, 3, 6], "keep_same_attributes_ml_and_gp": [1, 3], "parallel": [1, 3, 6], "gener": [1, 3], "purpos": [1, 3, 6], "add_count": 1, "network_vari": 1, "aadt": 1, "mndot_count_shst_data": 1, "none": [1, 3, 4, 5, 6], "widot_count_shst_data": 1, "mndot_count_variable_shp": [1, 2], "widot_count_variable_shp": 1, "count": [1, 2], "mc": [1, 2, 4], "join": [1, 3, 4, 6, 11], "data": [1, 2, 3, 4, 8, 11], "via": 1, "shst": 1, "api": 1, "match": 1, "result": [1, 6, 11], "should": [1, 2, 3, 6, 11], "written": [1, 11], "path": [1, 3, 4, 6, 11], "mndot": [1, 2], "locat": [1, 2, 3, 4], "widot": 1, "geodatabas": 1, "add_incident_link_data_to_nod": 1, "links_df": [1, 11], "nodes_df": [1, 11], "link_vari": 1, "unique_node_kei": 1, "model_node_id": [1, 2], "go": [1, 11], "assess": [1, 3], "connect": 1, "incid": 1, "where": 1, "length": [1, 6], "n": [1, 2, 3, 6, 11], "out": [1, 2, 5, 6, 8], "add_new_roadway_feature_chang": 1, "featur": [1, 6], "also": [1, 6, 11], "valid": [1, 6, 11], "add_variable_using_shst_refer": 1, "var_shst_csvdata": 1, "shst_csv_variabl": 1, "network_var_typ": 1, "overwrit": 1, "bool": [1, 3, 6], "addition_map": 1, "show": 1, "project_card_dictionari": 1, "wrapper": [1, 8, 11], "apply_managed_lane_feature_chang": 1, "link_idx": 1, "in_plac": [1, 11], "lndice": 1, "whether": 1, "decid": 1, "connector": [1, 2], "when": [1, 3, 4, 6], "thei": [1, 2], "more": [1, 6, 11], "apply_python_calcul": 1, "pycod": 1, "execut": 1, "code": [1, 2, 4, 6, 11], "apply_roadway_feature_chang": [1, 11], "select": [1, 11], "pass": [1, 5, 6], "assess_connect": [1, 11], "ignore_end_nod": [1, 11], "graph": 1, "disconnect": 1, "subgraph": 1, "describ": [1, 11], "member": [1, 6], "one": [1, 6, 11], "drive": [1, 11], "walk": [1, 11], "bike": 1, "ignor": [1, 6], "strai": 1, "singleton": 1, "tupl": [1, 6], "osmnx": [1, 11], "networkx": 1, "digraph": 1, "build_selection_kei": 1, "selection_dict": 1, "combin": [1, 2, 4, 6], "queri": [1, 11], "b": [1, 2, 11], "you": [1, 3, 11], "selection_dictonari": 1, "serv": 1, "calculate_area_typ": 1, "area_type_shap": [1, 2], "area_type_shape_vari": 1, "area_typ": [1, 2, 3], "area_type_codes_dict": 1, "downtown_area_type_shap": [1, 2], "downtown_area_typ": [1, 2], "area": [1, 2, 3, 6], "centroid": [1, 2, 6], "geometri": [1, 2, 6], "field": [1, 2, 11], "determin": [1, 3, 6], "label": 1, "isn": 1, "perfect": 1, "much": 1, "quicker": 1, "other": [1, 6, 11], "The": [1, 2, 5, 6, 11], "geodadabas": 1, "input": [1, 6], "downtown": [1, 2], "boundari": [1, 2, 6], "counti": [1, 2, 3], "calculate_centroidconnect": 1, "centroidconnect": [1, 2, 3], "highest_taz_numb": [1, 2], "as_integ": 1, "max": 1, "taz": [1, 2], "calculate_counti": 1, "county_shap": [1, 2], "county_shape_vari": 1, "county_codes_dict": 1, "calculate_dist": 1, "distanc": [1, 2, 3, 6], "centroidconnect_onli": 1, "mile": 1, "centroidconnector": 1, "calculate_mpo": 1, "county_network_vari": 1, "mpo": [1, 2], "mpo_counti": [1, 2], "param": [1, 4], "county_vari": 1, "region": [1, 6], "calculate_us": 1, "defauli": 1, "convert_int": 1, 
"int_col_nam": 1, "create_ml_vari": 1, "ml_lane": [1, 2], "ml": 1, "placehold": 1, "come": 1, "log": [1, 3, 8], "create_calculated_vari": 1, "create_dummy_connector_link": 1, "ml_df": 1, "access_lan": 1, "egress_lan": 1, "access_roadwai": 1, "ml_access": 1, "egress_roadwai": 1, "access_name_prefix": 1, "access": [1, 4, 6], "dummi": 1, "egress_name_prefix": 1, "egress": 1, "gp_df": 1, "roadai": 1, "prefix": 1, "create_hov_corridor_vari": 1, "segment_id": [1, 2], "hov": 1, "corridor": 1, "create_managed_lane_network": [1, 11], "keep_additional_attributes_ml_and_gp": 1, "separ": [1, 3, 6], "look": 1, "want": [1, 11], "leav": 1, "some": 1, "rigor": 1, "test": [1, 2, 6], "create_managed_vari": 1, "dataframe_to_fixed_width": 1, "df": 1, "fix": [1, 2], "width": [1, 6], "transform": [1, 6], "delete_roadway_feature_chang": 1, "ignore_miss": 1, "get": [1, 2, 6, 11], "miss": 1, "fail": [1, 6], "deletion_map": 1, "fill_na": 1, "fill": [1, 6], "na": 1, "from_roadwaynetwork": [1, 11], "roadway_network_object": 1, "get_attribut": 1, "join_kei": 1, "source_shst_ref_df": 1, "source_gdf": 1, "field_nam": 1, "get_managed_lane_node_id": 1, "nodes_list": 1, "4500000": 1, "237": 1, "get_modal_graph": 1, "bike_access": [1, 2], "bu": [1, 4], "bus_onli": [1, 2], "drive_access": [1, 2, 11], "rail": [1, 4], "rail_onli": [1, 2], "walk_access": [1, 2], "strongli": 1, "vertex": [1, 6], "reachabl": 1, "everi": [1, 6], "get_modal_links_nod": 1, "kept": 1, "both": [1, 4, 6, 11], "filter": [1, 6], "right": [1, 6], "now": 1, "we": [1, 4, 11], "don": 1, "becaus": [1, 4, 6, 11], "mark": 1, "issu": 1, "discuss": 1, "http": [1, 4, 6, 11], "github": [1, 6, 11], "com": [1, 4, 6, 11], "wsp": [1, 11], "sag": [1, 11], "145": 1, "modal_nodes_df": 1, "mode_node_vari": 1, "get_property_by_time_period_and_group": 1, "prop": 1, "categori": [1, 2, 11], "default_return": 1, "seri": [1, 11], "group": 1, "16": [1, 2, 6], "00": [1, 2], "19": [1, 2], "option": [1, 6], "sov": [1, 2], "search": [1, 2, 8, 11], "hov3": [1, 2], "hov2": [1, 2], "identify_seg": 1, "o_id": 1, "d_id": 1, "endpoint": 1, "up": [1, 5, 11], "segment": [1, 4, 6, 11], "candid": 1, "otherwis": [1, 4, 6], "ram": 1, "hog": 1, "could": [1, 6, 11], "odd": 1, "shortest": 1, "segment_vari": 1, "keep": [1, 6], "identify_segment_endpoint": 1, "min_connecting_link": 1, "10": [1, 2, 6], "min_dist": 1, "max_link_devi": 1, "2": [1, 2, 4, 6, 11], "is_network_connect": [1, 11], "consid": [1, 6, 11], "cach": 1, "long": [1, 6], "load_transform_network": 1, "node_filenam": [1, 11], "link_filenam": [1, 11], "shape_filenam": [1, 11], "4326": [1, 6], "validate_schema": 1, "disk": 1, "schema": [1, 11], "shapes_df": [1, 11], "network_connection_plot": 1, "g": [1, 6], "disconnected_subgraph_nod": 1, "plot": 1, "fig": 1, "ax": [1, 6], "orig_dest_nodes_foreign_kei": 1, "whatev": 1, "u": 1, "v": [1, 2, 11], "ab": 1, "noth": 1, "assum": 1, "a_id": 1, "b_id": 1, "ox_graph": 1, "unique_link_kei": 1, "model_link_id": [1, 2, 3], "arrai": [1, 6], "remov": [1, 6], "certain": 1, "do": [1, 6, 11], "too": [1, 5], "link_df": 1, "referenc": 1, "multidigraph": 1, "path_search": 1, "candidate_links_df": 1, "weight_column": 1, "weight_factor": 1, "search_breadth": 1, "5": [1, 2, 4, 6], "max_search_breadth": 1, "candidate_link": 1, "part": [1, 6, 11], "foreigh": 1, "destin": 1, "weight": 1, "iter": [1, 6], "multipli": 1, "fast": [1, 11], "recalculate_calculated_vari": [1, 3], "recalculate_dist": [1, 3], "json": [1, 11], "geojson": 1, "skip": 1, "speed": 1, "spatial": [1, 6], "etc": [1, 2, 3, 11], "re": 1, 
"read_match_result": 1, "lot": 1, "same": [1, 4, 6], "concaten": 1, "singl": [1, 3, 6], "geopanda": [1, 11], "sure": 1, "why": 1, "util": [1, 8, 11], "rename_variables_for_dbf": 1, "input_df": 1, "variable_crosswalk": 1, "output_vari": [1, 2], "convert_geometry_to_xi": 1, "renam": [1, 3], "dbf": 1, "shp": [1, 2], "char": 1, "crosswalk": [1, 3], "x": [1, 2, 6], "y": [1, 2, 6], "roadway_net_to_gdf": 1, "roadway_net": 1, "turn": [1, 11], "export": [1, 8, 11], "sophist": 1, "attach": 1, "roadway_standard_to_met_council_network": 1, "output_epsg": [1, 2], "consist": [1, 6, 11], "metcouncil": [1, 2, 4, 8, 11], "": [1, 2, 4, 6, 11], "model": [1, 2], "expect": [1, 6], "epsg": [1, 2, 6], "output": [1, 2, 3, 4, 6], "select_roadway_featur": [1, 11], "search_mod": 1, "force_search": 1, "sp_weight_factor": 1, "satisfi": [1, 6], "criteria": 1, "net": [1, 11], "osm": [1, 2], "share": [1, 11], "street": [1, 11], "osm_model_link_id": 1, "1234": 1, "shstid": 1, "4321": 1, "regex": 1, "facil": [1, 2, 11], "main": 1, "st": [1, 2], "least": [1, 11], "perform": [1, 4], "even": 1, "previou": 1, "discourag": 1, "meander": 1, "ref": 1, "here": [1, 6], "defaul": 1, "selection_has_unique_link_id": 1, "selection_dictionari": 1, "selection_map": 1, "selected_link_idx": 1, "candidate_link_idx": 1, "selected_links_idx": 1, "candidate_links_idx": 1, "shortest_path": 1, "graph_links_df": 1, "100": 1, "four": 1, "nx": 1, "split_properties_by_time_period_and_categori": 1, "properties_to_split": [1, 2], "split": [1, 2, 4], "structur": 1, "stratifi": 1, "times_period": 1, "am": [1, 2], "6": [1, 2, 4, 6], "pm": [1, 2], "15": [1, 2], "update_dist": 1, "use_shap": 1, "inplac": 1, "straight": 1, "avail": 1, "portion": 1, "provid": [1, 3, 4, 6, 11], "entir": 1, "crow": 1, "fly": 1, "meter": [1, 6], "nan": 1, "validate_link_schema": 1, "schema_loc": 1, "roadway_network_link": 1, "validate_node_schema": 1, "node_fil": 1, "roadway_network_nod": 1, "validate_properti": 1, "ignore_exist": 1, "require_existing_for_chang": 1, "theproject": 1, "dictonari": 1, "validate_select": 1, "selection_requir": 1, "whetther": 1, "minimum": [1, 5, 6], "validate_shape_schema": 1, "shape_fil": 1, "roadway_network_shap": 1, "validate_uniqu": 1, "confirm": 1, "met": 1, "filenam": [1, 3, 11], "were": 1, "save": 1, "write_roadway_as_fixedwidth": [1, 11], "output_dir": 1, "node_output_vari": 1, "link_output_vari": 1, "output_link_txt": [1, 2], "output_node_txt": [1, 2], "output_link_header_width_txt": [1, 2], "output_node_header_width_txt": [1, 2], "output_cube_network_script": [1, 2], "drive_onli": 1, "function": [1, 5, 6, 8, 11], "doe": [1, 4], "header": [1, 2], "3": [1, 2, 4, 6, 11], "script": [1, 2], "run": [1, 8], "databas": [1, 3], "record": [1, 4], "write_roadway_as_shp": [1, 11], "data_to_csv": 1, "data_to_dbf": 1, "output_link_shp": [1, 2], "output_node_shp": [1, 2], "output_link_csv": [1, 2], "output_node_csv": [1, 2], "output_gpkg": 1, "output_link_gpkg_lay": 1, "output_node_gpkg_lay": 1, "output_gpkg_link_filt": 1, "gpkg": 1, "csv": [1, 2, 3, 11], "full": [1, 6], "geopackag": 1, "layer": [1, 11], "subset": 1, "calculated_valu": [1, 3], "dai": [2, 4, 11], "can": [2, 6, 8, 11], "runtim": [2, 11], "initi": [2, 3, 11], "keyword": [2, 6, 11], "argument": [2, 6, 11], "explicitli": [2, 11], "highlight": [2, 11], "attr": 2, "time_period_to_tim": 2, "abbrevi": [2, 4], "gtf": [2, 4, 11], "highwai": [2, 3, 8, 11], "ea": 2, "md": 2, "ev": 2, "cube_time_period": 2, "4": [2, 4, 6], "demand": 2, "allow": [2, 6], "suffix": 2, "truck": 2, "trk": 2, "final": 2, 
"lanes_am": 2, "time_periods_to_tim": 2, "shapefil": 2, "r": 2, "metcouncil_data": 2, "cb_2017_us_county_5m": 2, "county_variable_shp": 2, "lanes_lookup_fil": 2, "lookup": 2, "centroid_connect_lan": 2, "anoka": 2, "dakota": 2, "hennepin": 2, "ramsei": 2, "scott": 2, "washington": 2, "carver": 2, "taz_shap": 2, "tazofficialwcurrentforecast": 2, "taz_data": 2, "highest": 2, "3100": 2, "link_id": 2, "shstgeometryid": 2, "roadway_class": 2, "truck_access": 2, "trn_priority_ea": 2, "trn_priority_am": 2, "trn_priority_md": 2, "trn_priority_pm": 2, "trn_priority_ev": 2, "ttime_assert_ea": 2, "ttime_assert_am": 2, "ttime_assert_md": 2, "ttime_assert_pm": 2, "ttime_assert_ev": 2, "lanes_ea": 2, "lanes_md": 2, "lanes_pm": 2, "lanes_ev": 2, "price_sov_ea": 2, "price_hov2_ea": 2, "price_hov3_ea": 2, "price_truck_ea": 2, "price_sov_am": 2, "price_hov2_am": 2, "price_hov3_am": 2, "price_truck_am": 2, "price_sov_md": 2, "price_hov2_md": 2, "price_hov3_md": 2, "price_truck_md": 2, "price_sov_pm": 2, "price_hov2_pm": 2, "price_hov3_pm": 2, "price_truck_pm": 2, "price_sov_ev": 2, "price_hov2_ev": 2, "price_hov3_ev": 2, "price_truck_ev": 2, "roadway_class_idx": 2, "facility_typ": 2, "osm_node_id": [2, 6, 11], "bike_nod": 2, "transit_nod": 2, "walk_nod": 2, "drive_nod": 2, "ml_lanes_ea": 2, "ml_lanes_am": 2, "ml_lanes_md": 2, "ml_lanes_pm": 2, "ml_lanes_ev": 2, "osm_facility_type_dict": 2, "thrivemsp2040communitydesign": 2, "area_type_variable_shp": 2, "comdes2040": 2, "area_type_code_dict": 2, "23": 2, "urban": [2, 4], "center": [2, 6], "24": 2, "25": 2, "35": 2, "36": 2, "41": 2, "51": 2, "52": 2, "53": 2, "60": 2, "downtownzones_taz": 2, "mrcc_roadway_class_shap": 2, "mrcc": 2, "trans_mrcc_centerlin": 2, "mrcc_roadway_class_variable_shp": 2, "mrcc_roadway_class_shp": 2, "route_si": 2, "widot_roadway_class_shap": 2, "wisconsin": 2, "wisconsin_lanes_counts_median": 2, "wislr": 2, "widot_roadway_class_variable_shp": 2, "rdwy_ctgy_": 2, "mndot_count_shap": 2, "count_mn": 2, "aadt_2017_count_loc": 2, "osm_highway_facility_type_crosswalk": 2, "legacy_tm2_attribut": 2, "shstreferenceid": 2, "legaci": 2, "tm2": 2, "osm_lanes_attribut": 2, "tam_tm2_attribut": 2, "tam": 2, "tom_tom_attribut": 2, "tomtom": 2, "tomtom_attribut": 2, "sfcta_attribut": 2, "sfcta": 2, "geograph": 2, "102646": 2, "scratch": 2, "txt": [2, 11], "links_header_width": 2, "nodes_header_width": 2, "import": [2, 6], "make_complete_network_from_fixed_width_fil": 2, "county_link_range_dict": 2, "county_code_dict": 2, "7": [2, 4, 11], "extern": 2, "chisago": 2, "11": 2, "goodhu": 2, "12": 2, "isanti": 2, "13": 2, "le": 2, "sueur": 2, "14": 2, "mcleod": 2, "pierc": 2, "polk": 2, "17": 2, "rice": 2, "18": 2, "sherburn": 2, "siblei": 2, "20": 2, "croix": 2, "21": 2, "wright": 2, "22": 2, "maz_shape_fil": 2, "route_type_bus_mode_dict": 2, "urb": 2, "loc": 2, "sub": [2, 6], "express": [2, 4, 6], "route_type_mode_dict": 2, "8": [2, 4, 6], "9": [2, 4], "cube_time_periods_nam": 2, "op": 2, "detail": [2, 5], "zone": 2, "possibl": [2, 6], "roadway_link_chang": 3, "roadway_node_chang": 3, "transit_chang": [3, 4], "base_roadway_network": 3, "base_cube_transit_network": 3, "build_cube_transit_network": 3, "project_nam": 3, "produc": [3, 6], "test_project": [3, 11], "create_project": [3, 11], "base_cube_transit_sourc": 3, "o": [3, 4, 11], "build_cube_transit_sourc": 3, "transit_route_shape_chang": [3, 11], "evaluate_chang": [3, 11], "write_project_card": [3, 11], "scratch_dir": [3, 11], "t_transit_shape_test": [3, 11], "yml": [3, 11], "default_project_nam": 3, 
"level": 3, "constant": 3, "static_valu": 3, "card_data": 3, "cubetransit": [3, 8], "bunch": 3, "projectcard": [3, 8], "case": 3, "standardtransit": [3, 8], "add_highway_chang": 3, "limit_variables_to_existing_network": 3, "hoc": [3, 11], "add_transit_chang": 3, "roadway_log_fil": 3, "roadway_shp_fil": 3, "roadway_csv_fil": 3, "network_build_fil": 3, "emme_node_id_crosswalk_fil": 3, "emme_name_crosswalk_fil": 3, "base_roadway_dir": 3, "base_transit_dir": [3, 4, 11], "consum": 3, "logfil": 3, "emm": [3, 4], "folder": 3, "base_cube_transit_fil": 3, "build_cube_transit_fil": 3, "first": [3, 4, 6], "recalcul": 3, "determine_roadway_network_changes_compat": 3, "emme_id_to_wrangler_id": 3, "emme_link_change_df": 3, "emme_node_change_df": 3, "emme_transit_changes_df": 3, "rewrit": 3, "wrangler": [3, 8, 11], "emme_name_to_wrangler_nam": 3, "aggreg": 3, "get_object_from_network_build_command": 3, "row": [3, 4], "histori": 3, "l": 3, "get_operation_from_network_build_command": 3, "action": 3, "c": [3, 6], "d": [3, 6], "read_logfil": 3, "logfilenam": 3, "reprsent": [3, 11], "read_network_build_fil": 3, "networkbuildfilenam": 3, "nework": 3, "assign_group": 3, "user": [3, 11], "TO": 3, "ptg_feed": 4, "hold": [4, 11], "feed": [4, 11], "partridg": [4, 11], "manipul": [4, 11], "cube_transit_net": [4, 11], "read_gtf": [4, 11], "write_as_cube_lin": [4, 11], "write_dir": [4, 11], "outfil": [4, 11], "calculate_cube_mod": 4, "assign": 4, "logic": 4, "route_typ": 4, "develop": [4, 11], "googl": 4, "cube_mod": 4, "route_type_to_cube_mod": 4, "tram": 4, "streetcar": 4, "light": 4, "further": 4, "disaggreg": 4, "buse": 4, "suburban": 4, "longnam": 4, "lower": [4, 6], "elif": 4, "99": 4, "local": [4, 11], "els": [4, 6], "route_long_nam": 4, "cube_format": 4, "represnt": 4, "notat": 4, "trip": 4, "trip_id": 4, "shape_id": [4, 11], "tod": 4, "onewai": 4, "oper": [4, 6], "fromtransitnetwork": 4, "transit_network_object": 4, "modelroadwaynetwork": [4, 8], "gtfs_feed_dir": 4, "route_properties_gtfs_to_cub": 4, "prepar": 4, "trip_df": 4, "shape_gtfs_to_cub": 4, "add_nntim": 4, "shape_gtfs_to_dict_list": 4, "step": 4, "through": 4, "todo": 4, "elimin": 4, "necess": 4, "tag": [4, 11], "begin": 4, "As": 4, "m": 4, "minim": 4, "modif": 4, "question": 4, "shape_pt_sequ": 4, "shape_mode_node_id": 4, "is_stop": 4, "stop_sequ": 4, "shape_gtfs_to_emm": 4, "trip_row": 4, "time_to_cube_time_period": 4, "start_time_sec": 4, "as_str": 4, "verbos": 4, "midnight": [4, 6], "this_tp": 4, "this_tp_num": 4, "outpath": 4, "after": 4, "setuplog": 5, "infologfilenam": 5, "debuglogfilenam": 5, "logtoconsol": 5, "infolog": 5, "ters": 5, "just": 5, "give": 5, "bare": 5, "composit": 5, "clear": 5, "later": 5, "debuglog": 5, "veri": [5, 6], "noisi": 5, "debug": 5, "spew": 5, "consol": 5, "point": 6, "arg": 6, "basegeometri": 6, "possibli": 6, "z": 6, "zero": 6, "dimension": 6, "float": 6, "sequenc": 6, "individu": 6, "p": 6, "print": 6, "almost_equ": 6, "decim": 6, "equal": 6, "place": 6, "deprec": 6, "sinc": 6, "confus": 6, "equals_exact": 6, "instead": 6, "approxim": 6, "compon": [6, 8], "linestr": 6, "1e": 6, "buffer": 6, "quad_seg": 6, "cap_styl": 6, "round": 6, "join_styl": 6, "mitre_limit": 6, "single_sid": 6, "dilat": 6, "eros": 6, "small": 6, "mai": 6, "sometim": 6, "tidi": 6, "polygon": 6, "around": [6, 8, 11], "resolut": 6, "angl": 6, "fillet": 6, "buffercapstyl": 6, "squar": 6, "flat": 6, "circular": 6, "rectangular": 6, "while": 6, "involv": 6, "bufferjoinstyl": 6, "mitr": 6, "bevel": 6, "midpoint": 6, "edg": [6, 8], "touch": 6, 
"depend": 6, "limit": 6, "ratio": 6, "sharp": 6, "corner": 6, "offset": 6, "meet": 6, "miter": 6, "extend": 6, "To": [6, 11], "prevent": 6, "unreason": 6, "control": 6, "maximum": 6, "exce": 6, "side": 6, "sign": 6, "left": 6, "hand": 6, "regular": 6, "cap": 6, "alwai": 6, "forc": 6, "equival": 6, "cap_flat": 6, "quadseg": 6, "alia": 6, "strictli": 6, "wkt": 6, "load": 6, "gon": 6, "approx": 6, "radiu": 6, "circl": 6, "1365484905459": 6, "128": 6, "141513801144": 6, "triangl": 6, "exterior": 6, "coord": 6, "contains_properli": 6, "complet": 6, "common": 6, "document": [6, 11], "covered_bi": 6, "cover": 6, "cross": 6, "grid_siz": 6, "disjoint": 6, "unitless": 6, "dwithin": 6, "given": [6, 11], "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "line_interpolate_point": 6, "intersect": 6, "line_locate_point": 6, "nearest": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "point_on_surfac": 6, "guarante": 6, "cheapli": 6, "representative_point": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "interior": 6, "unchang": 6, "is_ccw": 6, "clockwis": 6, "max_segment_length": 6, "vertic": 6, "longer": 6, "evenli": 6, "subdivid": 6, "densifi": 6, "unmodifi": 6, "array_lik": 6, "greater": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "union": 6, "dimens": 6, "bound": 6, "collect": 6, "empti": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "orient": 6, "rotat": 6, "rectangl": 6, "enclos": 6, "unlik": 6, "constrain": 6, "degener": 6, "oriented_envelop": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "pair": [6, 11], "tripl": 6, "abov": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, 
"transliter": 6, "unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, "unidecodeerror": 6, "substitut": 6, "might": [6, 11], "packag": [8, 11], "mtc": 8, "aim": 8, "networkwrangl": [8, 11], "refin": 8, "respect": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "clone": 8, "brief": 8, "intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, "rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "travel": 11, "break": 11, "publictransport": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", 
"line_properties"], [0, 3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], [1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", "calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "maz_shape_file"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatability"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", 
"get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, "", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", "roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, "", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", "cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_dict_list"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", "time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", 
"disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 3, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 9, "sphinx.domains.index": 1, "sphinx.domains.javascript": 3, "sphinx.domains.math": 2, "sphinx.domains.python": 4, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 58}, "alltitles": {"lasso.CubeTransit": [[0, "lasso-cubetransit"]], "lasso.ModelRoadwayNetwork": [[1, "lasso-modelroadwaynetwork"]], "Todo": [[1, "id1"], [1, "id2"], [1, "id3"], [1, "id4"], [1, "id5"], [1, "id6"]], "lasso.Parameters": [[2, "lasso-parameters"]], "lasso.Project": [[3, "lasso-project"]], "lasso.StandardTransit": [[4, "lasso-standardtransit"]], "lasso.logger": [[5, "module-lasso.logger"]], "lasso.util": [[6, "module-lasso.util"]], "Lasso Classes and Functions": 
[[7, "module-lasso"]], "Base Classes": [[7, "base-classes"]], "Utils and Functions": [[7, "utils-and-functions"]], "Welcome to lasso\u2019s documentation!": [[8, "welcome-to-lasso-s-documentation"]], "Contents:": [[8, null]], "Indices and tables": [[8, "indices-and-tables"]], "Running Lasso": [[9, "running-lasso"]], "Create project files": [[9, "create-project-files"]], "Create a scenario": [[9, "create-a-scenario"]], "Exporting networks": [[9, "exporting-networks"]], "Auditing and Reporting": [[9, "auditing-and-reporting"]], "Setup": [[10, "setup"]], "Projects": [[10, "projects"]], "Parameters": [[10, "parameters"], [11, "parameters"]], "Settings": [[10, "settings"]], "Additional Data Files": [[10, "additional-data-files"]], "Starting Out": [[11, "starting-out"]], "Installation": [[11, "installation"]], "Bleeding Edge": [[11, "bleeding-edge"]], "From Clone": [[11, "from-clone"]], "Brief Intro": [[11, "brief-intro"]], "Components": [[11, "components"]], "RoadwayNetwork": [[11, "roadwaynetwork"]], "TransitNetwork": [[11, "transitnetwork"]], "ProjectCard": [[11, "projectcard"]], "Scenario": [[11, "scenario"]], "Project": [[11, "project"]], "ModelRoadwayNetwork": [[11, "modelroadwaynetwork"]], "StandardTransit": [[11, "standardtransit"]], "CubeTransit": [[11, "cubetransit"]], "Typical Workflow": [[11, "typical-workflow"]], "Project Cards from Transit LIN Files": [[11, "project-cards-from-transit-lin-files"]], "Project Cards from Cube LOG Files": [[11, "project-cards-from-cube-log-files"]], "Model Network Files for a Scenario": [[11, "model-network-files-for-a-scenario"]], "Running Quickstart Jupyter Notebooks": [[11, "running-quickstart-jupyter-notebooks"]]}, "indexentries": {"cubetransit (class in lasso)": [[0, "lasso.CubeTransit"]], "__init__() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.__init__"]], "add_additional_time_periods() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_additional_time_periods"]], "add_cube() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_cube"]], "build_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.build_route_name"]], "calculate_start_end_times() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.calculate_start_end_times"]], "create_add_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_add_route_card_dict"]], "create_delete_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_delete_route_card_dict"]], "create_from_cube() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.create_from_cube"]], "create_update_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_update_route_card_dict"]], "cube_properties_to_standard_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.cube_properties_to_standard_properties"]], "diff_dict (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.diff_dict"]], "evaluate_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_differences"]], "evaluate_route_property_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_route_property_differences"]], "evaluate_route_shape_changes() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.evaluate_route_shape_changes"]], "get_time_period_numbers_from_cube_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.get_time_period_numbers_from_cube_properties"]], "line_properties (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.line_properties"]], "lines 
(lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.lines"]], "parameters (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.parameters"]], "program_type (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.program_type"]], "shapes (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.shapes"]], "source_list (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.source_list"]], "unpack_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.unpack_route_name"]], "calculated_values (lasso.modelroadwaynetwork attribute)": [[1, "lasso.ModelRoadwayNetwork.CALCULATED_VALUES"]], "modelroadwaynetwork (class in lasso)": [[1, "lasso.ModelRoadwayNetwork"]], "__init__() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.__init__"]], "add_counts() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_counts"]], "add_incident_link_data_to_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.add_incident_link_data_to_nodes"]], "add_new_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_new_roadway_feature_change"]], "add_variable_using_shst_reference() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_variable_using_shst_reference"]], "addition_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.addition_map"]], "apply() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply"]], "apply_managed_lane_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_managed_lane_feature_change"]], "apply_python_calculation() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_python_calculation"]], "apply_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_roadway_feature_change"]], "assess_connectivity() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.assess_connectivity"]], "build_selection_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.build_selection_key"]], "calculate_area_type() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_area_type"]], "calculate_centroidconnect() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_centroidconnect"]], "calculate_county() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_county"]], "calculate_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_distance"]], "calculate_mpo() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_mpo"]], "calculate_use() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_use"]], "convert_int() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.convert_int"]], "create_ml_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_ML_variable"]], "create_calculated_variables() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_calculated_variables"]], "create_dummy_connector_links() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_dummy_connector_links"]], "create_hov_corridor_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_hov_corridor_variable"]], "create_managed_lane_network() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.create_managed_lane_network"]], "create_managed_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_variable"]], "dataframe_to_fixed_width() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.dataframe_to_fixed_width"]], "delete_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.delete_roadway_feature_change"]], "deletion_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.deletion_map"]], "fill_na() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.fill_na"]], "from_roadwaynetwork() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.from_RoadwayNetwork"]], "get_attribute() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_attribute"]], "get_managed_lane_node_ids() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_managed_lane_node_ids"]], "get_modal_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_graph"]], "get_modal_links_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_links_nodes"]], "get_property_by_time_period_and_group() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.get_property_by_time_period_and_group"]], "identify_segment() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment"]], "identify_segment_endpoints() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment_endpoints"]], "is_network_connected() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.is_network_connected"]], "load_transform_network() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.load_transform_network"]], "network_connection_plot() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.network_connection_plot"]], "orig_dest_nodes_foreign_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.orig_dest_nodes_foreign_key"]], "ox_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.ox_graph"]], "path_search() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.path_search"]], "read() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read"]], "read_match_result() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read_match_result"]], "rename_variables_for_dbf() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.rename_variables_for_dbf"]], "roadway_net_to_gdf() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.roadway_net_to_gdf"]], "roadway_standard_to_met_council_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.roadway_standard_to_met_council_network"]], "select_roadway_features() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.select_roadway_features"]], "selection_has_unique_link_id() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_has_unique_link_id"]], "selection_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_map"]], "shortest_path() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.shortest_path"]], "split_properties_by_time_period_and_category() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.split_properties_by_time_period_and_category"]], "update_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.update_distance"]], "validate_link_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_link_schema"]], "validate_node_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_node_schema"]], "validate_properties() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_properties"]], "validate_selection() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_selection"]], "validate_shape_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_shape_schema"]], "validate_uniqueness() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_uniqueness"]], "write() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write"]], "write_roadway_as_fixedwidth() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_fixedwidth"]], "write_roadway_as_shp() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_shp"]], "parameters (class in lasso)": [[2, "lasso.Parameters"]], "__init__() (lasso.parameters method)": [[2, "lasso.Parameters.__init__"]], "county_link_range_dict (lasso.parameters attribute)": [[2, "lasso.Parameters.county_link_range_dict"]], "maz_shape_file (lasso.parameters attribute)": [[2, "lasso.Parameters.maz_shape_file"]], "properties_to_split (lasso.parameters attribute)": [[2, "lasso.Parameters.properties_to_split"]], "zones (lasso.parameters attribute)": [[2, "lasso.Parameters.zones"]], "calculated_values (lasso.project attribute)": [[3, "lasso.Project.CALCULATED_VALUES"]], "default_project_name (lasso.project attribute)": [[3, "id0"], [3, "lasso.Project.DEFAULT_PROJECT_NAME"]], "project (class in lasso)": [[3, "lasso.Project"]], "static_values (lasso.project attribute)": [[3, "id1"], [3, "lasso.Project.STATIC_VALUES"]], "__init__() (lasso.project method)": [[3, "lasso.Project.__init__"]], "add_highway_changes() (lasso.project method)": [[3, "lasso.Project.add_highway_changes"]], "add_transit_changes() (lasso.project method)": [[3, "lasso.Project.add_transit_changes"]], "base_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.base_cube_transit_network"]], "base_roadway_network (lasso.project attribute)": [[3, "lasso.Project.base_roadway_network"]], "build_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.build_cube_transit_network"]], "card_data (lasso.project attribute)": [[3, "lasso.Project.card_data"]], "create_project() (lasso.project static method)": [[3, "lasso.Project.create_project"]], "determine_roadway_network_changes_compatability() (lasso.project static method)": [[3, "lasso.Project.determine_roadway_network_changes_compatability"]], "emme_id_to_wrangler_id() (lasso.project static method)": [[3, "lasso.Project.emme_id_to_wrangler_id"]], "emme_name_to_wrangler_name() (lasso.project static method)": [[3, "lasso.Project.emme_name_to_wrangler_name"]], "evaluate_changes() (lasso.project method)": [[3, "lasso.Project.evaluate_changes"]], "get_object_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_object_from_network_build_command"]], "get_operation_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_operation_from_network_build_command"]], "parameters (lasso.project 
attribute)": [[3, "lasso.Project.parameters"]], "project_name (lasso.project attribute)": [[3, "lasso.Project.project_name"]], "read_logfile() (lasso.project static method)": [[3, "lasso.Project.read_logfile"]], "read_network_build_file() (lasso.project static method)": [[3, "lasso.Project.read_network_build_file"]], "roadway_link_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_link_changes"]], "roadway_node_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_node_changes"]], "transit_changes (lasso.project attribute)": [[3, "lasso.Project.transit_changes"]], "write_project_card() (lasso.project method)": [[3, "lasso.Project.write_project_card"]], "standardtransit (class in lasso)": [[4, "lasso.StandardTransit"]], "__init__() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.__init__"]], "calculate_cube_mode() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.calculate_cube_mode"]], "cube_format() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.cube_format"]], "evaluate_differences() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.evaluate_differences"]], "feed (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.feed"]], "fromtransitnetwork() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.fromTransitNetwork"]], "parameters (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.parameters"]], "read_gtfs() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.read_gtfs"]], "route_properties_gtfs_to_cube() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.route_properties_gtfs_to_cube"]], "shape_gtfs_to_cube() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_cube"]], "shape_gtfs_to_dict_list() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_dict_list"]], "shape_gtfs_to_emme() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_emme"]], "time_to_cube_time_period() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.time_to_cube_time_period"]], "write_as_cube_lin() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.write_as_cube_lin"]], "lasso.logger": [[5, "module-lasso.logger"]], "module": [[5, "module-lasso.logger"], [6, "module-lasso.util"], [7, "module-lasso"]], "setuplogging() (in module lasso.logger)": [[5, "lasso.logger.setupLogging"]], "point (class in lasso.util)": [[6, "lasso.util.Point"]], "polygon (class in lasso.util)": [[6, "lasso.util.Polygon"]], "almost_equals() (lasso.util.point method)": [[6, "lasso.util.Point.almost_equals"]], "almost_equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.almost_equals"]], "area (lasso.util.point property)": [[6, "lasso.util.Point.area"]], "area (lasso.util.polygon property)": [[6, "lasso.util.Polygon.area"]], "args (lasso.util.partial attribute)": [[6, "lasso.util.partial.args"]], "boundary (lasso.util.point property)": [[6, "lasso.util.Point.boundary"]], "boundary (lasso.util.polygon property)": [[6, "lasso.util.Polygon.boundary"]], "bounds (lasso.util.point property)": [[6, "lasso.util.Point.bounds"]], "bounds (lasso.util.polygon property)": [[6, "lasso.util.Polygon.bounds"]], "buffer() (lasso.util.point method)": [[6, "lasso.util.Point.buffer"]], "buffer() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.buffer"]], "centroid (lasso.util.point property)": [[6, "lasso.util.Point.centroid"]], "centroid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.centroid"]], 
"column_name_to_parts() (in module lasso.util)": [[6, "lasso.util.column_name_to_parts"]], "contains() (lasso.util.point method)": [[6, "lasso.util.Point.contains"]], "contains() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains"]], "contains_properly() (lasso.util.point method)": [[6, "lasso.util.Point.contains_properly"]], "contains_properly() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains_properly"]], "convex_hull (lasso.util.point property)": [[6, "lasso.util.Point.convex_hull"]], "convex_hull (lasso.util.polygon property)": [[6, "lasso.util.Polygon.convex_hull"]], "coords (lasso.util.point property)": [[6, "lasso.util.Point.coords"]], "coords (lasso.util.polygon property)": [[6, "lasso.util.Polygon.coords"]], "covered_by() (lasso.util.point method)": [[6, "lasso.util.Point.covered_by"]], "covered_by() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covered_by"]], "covers() (lasso.util.point method)": [[6, "lasso.util.Point.covers"]], "covers() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covers"]], "create_locationreference() (in module lasso.util)": [[6, "lasso.util.create_locationreference"]], "crosses() (lasso.util.point method)": [[6, "lasso.util.Point.crosses"]], "crosses() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.crosses"]], "difference() (lasso.util.point method)": [[6, "lasso.util.Point.difference"]], "difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.difference"]], "disjoint() (lasso.util.point method)": [[6, "lasso.util.Point.disjoint"]], "disjoint() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.disjoint"]], "distance() (lasso.util.point method)": [[6, "lasso.util.Point.distance"]], "distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.distance"]], "dwithin() (lasso.util.point method)": [[6, "lasso.util.Point.dwithin"]], "dwithin() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.dwithin"]], "envelope (lasso.util.point property)": [[6, "lasso.util.Point.envelope"]], "envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.envelope"]], "equals() (lasso.util.point method)": [[6, "lasso.util.Point.equals"]], "equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals"]], "equals_exact() (lasso.util.point method)": [[6, "lasso.util.Point.equals_exact"]], "equals_exact() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals_exact"]], "exterior (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.exterior"]], "exterior (lasso.util.polygon property)": [[6, "id0"]], "from_bounds() (lasso.util.polygon class method)": [[6, "lasso.util.Polygon.from_bounds"]], "func (lasso.util.partial attribute)": [[6, "lasso.util.partial.func"]], "geodesic_point_buffer() (in module lasso.util)": [[6, "lasso.util.geodesic_point_buffer"]], "geom_type (lasso.util.point property)": [[6, "lasso.util.Point.geom_type"]], "geom_type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.geom_type"]], "geometrytype() (lasso.util.point method)": [[6, "lasso.util.Point.geometryType"]], "geometrytype() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.geometryType"]], "get_shared_streets_intersection_hash() (in module lasso.util)": [[6, "lasso.util.get_shared_streets_intersection_hash"]], "has_z (lasso.util.point property)": [[6, "lasso.util.Point.has_z"]], "has_z (lasso.util.polygon property)": [[6, "lasso.util.Polygon.has_z"]], "hausdorff_distance() (lasso.util.point method)": [[6, "lasso.util.Point.hausdorff_distance"]], "hausdorff_distance() (lasso.util.polygon 
method)": [[6, "lasso.util.Polygon.hausdorff_distance"]], "hhmmss_to_datetime() (in module lasso.util)": [[6, "lasso.util.hhmmss_to_datetime"]], "interiors (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.interiors"]], "interiors (lasso.util.polygon property)": [[6, "id1"]], "interpolate() (lasso.util.point method)": [[6, "lasso.util.Point.interpolate"]], "interpolate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.interpolate"]], "intersection() (lasso.util.point method)": [[6, "lasso.util.Point.intersection"]], "intersection() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersection"]], "intersects() (lasso.util.point method)": [[6, "lasso.util.Point.intersects"]], "intersects() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersects"]], "is_closed (lasso.util.point property)": [[6, "lasso.util.Point.is_closed"]], "is_closed (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_closed"]], "is_empty (lasso.util.point property)": [[6, "lasso.util.Point.is_empty"]], "is_empty (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_empty"]], "is_ring (lasso.util.point property)": [[6, "lasso.util.Point.is_ring"]], "is_ring (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_ring"]], "is_simple (lasso.util.point property)": [[6, "lasso.util.Point.is_simple"]], "is_simple (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_simple"]], "is_valid (lasso.util.point property)": [[6, "lasso.util.Point.is_valid"]], "is_valid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_valid"]], "keywords (lasso.util.partial attribute)": [[6, "lasso.util.partial.keywords"]], "lasso.util": [[6, "module-lasso.util"]], "length (lasso.util.point property)": [[6, "lasso.util.Point.length"]], "length (lasso.util.polygon property)": [[6, "lasso.util.Polygon.length"]], "line_interpolate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_interpolate_point"]], "line_interpolate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_interpolate_point"]], "line_locate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_locate_point"]], "line_locate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_locate_point"]], "minimum_clearance (lasso.util.point property)": [[6, "lasso.util.Point.minimum_clearance"]], "minimum_clearance (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_clearance"]], "minimum_rotated_rectangle (lasso.util.point property)": [[6, "lasso.util.Point.minimum_rotated_rectangle"]], "minimum_rotated_rectangle (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_rotated_rectangle"]], "normalize() (lasso.util.point method)": [[6, "lasso.util.Point.normalize"]], "normalize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.normalize"]], "oriented_envelope (lasso.util.point property)": [[6, "lasso.util.Point.oriented_envelope"]], "oriented_envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.oriented_envelope"]], "overlaps() (lasso.util.point method)": [[6, "lasso.util.Point.overlaps"]], "overlaps() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.overlaps"]], "partial (class in lasso.util)": [[6, "lasso.util.partial"]], "point_on_surface() (lasso.util.point method)": [[6, "lasso.util.Point.point_on_surface"]], "point_on_surface() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.point_on_surface"]], "project() (lasso.util.point method)": [[6, "lasso.util.Point.project"]], "project() (lasso.util.polygon method)": [[6, 
"lasso.util.Polygon.project"]], "relate() (lasso.util.point method)": [[6, "lasso.util.Point.relate"]], "relate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate"]], "relate_pattern() (lasso.util.point method)": [[6, "lasso.util.Point.relate_pattern"]], "relate_pattern() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate_pattern"]], "representative_point() (lasso.util.point method)": [[6, "lasso.util.Point.representative_point"]], "representative_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.representative_point"]], "reverse() (lasso.util.point method)": [[6, "lasso.util.Point.reverse"]], "reverse() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.reverse"]], "secs_to_datetime() (in module lasso.util)": [[6, "lasso.util.secs_to_datetime"]], "segmentize() (lasso.util.point method)": [[6, "lasso.util.Point.segmentize"]], "segmentize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.segmentize"]], "shorten_name() (in module lasso.util)": [[6, "lasso.util.shorten_name"]], "simplify() (lasso.util.point method)": [[6, "lasso.util.Point.simplify"]], "simplify() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.simplify"]], "svg() (lasso.util.point method)": [[6, "lasso.util.Point.svg"]], "svg() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.svg"]], "symmetric_difference() (lasso.util.point method)": [[6, "lasso.util.Point.symmetric_difference"]], "symmetric_difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.symmetric_difference"]], "touches() (lasso.util.point method)": [[6, "lasso.util.Point.touches"]], "touches() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.touches"]], "transform() (in module lasso.util)": [[6, "lasso.util.transform"]], "type (lasso.util.point property)": [[6, "lasso.util.Point.type"]], "type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.type"]], "unidecode() (in module lasso.util)": [[6, "lasso.util.unidecode"]], "union() (lasso.util.point method)": [[6, "lasso.util.Point.union"]], "union() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.union"]], "within() (lasso.util.point method)": [[6, "lasso.util.Point.within"]], "within() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.within"]], "wkb (lasso.util.point property)": [[6, "lasso.util.Point.wkb"]], "wkb (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb"]], "wkb_hex (lasso.util.point property)": [[6, "lasso.util.Point.wkb_hex"]], "wkb_hex (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb_hex"]], "wkt (lasso.util.point property)": [[6, "lasso.util.Point.wkt"]], "wkt (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkt"]], "x (lasso.util.point property)": [[6, "lasso.util.Point.x"]], "xy (lasso.util.point property)": [[6, "lasso.util.Point.xy"]], "xy (lasso.util.polygon property)": [[6, "lasso.util.Polygon.xy"]], "y (lasso.util.point property)": [[6, "lasso.util.Point.y"]], "z (lasso.util.point property)": [[6, "lasso.util.Point.z"]], "lasso": [[7, "module-lasso"]]}}) \ No newline at end of file diff --git a/branch/test_no_change/setup/index.html b/branch/test_no_change/setup/index.html new file mode 100644 index 0000000..224865c --- /dev/null +++ b/branch/test_no_change/setup/index.html @@ -0,0 +1,133 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/test_no_change/starting/index.html b/branch/test_no_change/starting/index.html new file mode 100644 index 0000000..1741429 --- /dev/null +++ b/branch/test_no_change/starting/index.html @@ -0,0 +1,434 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

The following example uses a conda environment (recommended) and the pip package manager to install Lasso from source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from PyPI because it is listed in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of Network Wrangler and Lasso, you can do so by installing them from the develop branch of each GitHub repository:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e flag installs it in editable mode.

+

If you plan to do development on both Network Wrangler and Lasso locally, consider installing Network Wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e flag installs the package in editable mode.

  2. If you are not part of the project team and want to contribute code back to the project, please fork before you clone, and then add the original repository to your remotes as upstream per these directions on GitHub.

  3. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @<tag>.

  4. If you want to make use of frequent developer updates for Network Wrangler as well, you can also install it from a clone by following the same cloning and installation steps shown above for Lasso.
+

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc., with Python syntax highlighting turned on

  • GitHub Desktop to keep your local clones up to date
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and Network Wrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of OpenStreetMap and Shared Streets. In Network Wrangler these are read in from three JSON files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields, so that any field can be defined for an ad-hoc time-of-day span or user category (a small sketch follows this list).

  • [transit network], which is based on a frequency-based implementation of the CSV-based GTFS; and

  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in YAML (yml) files.
+
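To make the nested-field idea concrete, the following is a minimal sketch of a time-of-day-varying link attribute. The key names used here (default, timeofday, time, value) are illustrative assumptions rather than the authoritative network_wrangler schema:

# Minimal sketch of a link with a nested, time-of-day-varying field.
+# The key names ("default", "timeofday", "time", "value") are assumptions
+# for illustration; consult the network_wrangler roadway schema for the
+# authoritative structure.
+link = {
+    "model_link_id": 1234,
+    "lanes": {
+        "default": 3,
+        "timeofday": [
+            {"time": ["6:00", "9:00"], "value": 2},  # AM-peak override
+        ],
+    },
+}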

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds data fields to the roadway network schema that MetCouncil uses in its travel model, including breaking out data fields by time period;

  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format;

  • [Cube Log Files], which document changes to the roadway network made in the Cube GUI. Lasso translates these to project cards so they can be used by Network Wrangler; and

  • [Cube public transport line files], which define a set of transit lines in the Cube software.
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • TransitNetwork object, which represents the transit network data as DataFrames;

  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • Scenario object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios, and can query and add ProjectCards to describe a set of changes that should be made to the network.
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in Cube format, a base and a build highway network, or a base highway network and a Cube log file;

  • ModelRoadwayNetwork object, a subclass of RoadwayNetwork that contains MetCouncil-specific methods to define and create MetCouncil-specific variables and to export the network to a format that can be read by Cube;

  • StandardTransit, an object that holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil’s Cube line files;

  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. It can parse Cube line file properties and shapes into Python dictionaries, and can compare line files and represent the changes as Project Card dictionaries;

  • Parameters, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a Parameters instance with a keyword argument setting the attribute; parameters that are not explicitly set use the defaults listed in this class.
+
+

RoadwayNetwork

+

Reads, writes, queries, and manipulates roadway network data, which is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key ='shape_id',
+        
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+net.apply_roadway_feature_change(
+    net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
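Represents the transit network data as DataFrames (see Components above). A minimal usage sketch; the import path is assumed, and STPAUL_DIR is the same placeholder directory used in the Scenario example below.

from network_wrangler import TransitNetwork
+
+# Read a standard transit feed from a directory of network files;
+# STPAUL_DIR is the same placeholder used in the Scenario example below.
+transit_net = TransitNetwork.read(STPAUL_DIR)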
+
+

ProjectCard

+
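Represents the data of a single project card. A minimal usage sketch; the import paths are assumed, and the card file and directory names reuse the placeholders from the Scenario example below.

import os
+from network_wrangler import ProjectCard
+
+# Read a single project card from YAML; validate=False skips schema
+# validation, mirroring the Scenario example below.
+my_card = ProjectCard.read(
+    os.path.join(STPAUL_DIR, "project_cards", "4_simple_managed_lane.yml"),
+    validate=False,
+)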
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key ='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a Cube log file and a base network.

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork class which has additional understanding about how to translate and write the network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key ='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the Project class, CubeTransit has the capability to:

+
    +
  • Parse Cube line file properties and shapes into Python dictionaries

  • Compare line files and represent changes as Project Card dictionaries
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration file
+# (parameters could also be provided as key/value pairs)
+with open(config_file) as f:
+    my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+)
+# network written with direction from the parameters given
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.
+
+

Project Cards from Transit LIN Files

+
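A sketch of this workflow, mirroring the Project example above; the base and build sources are Cube .LIN inputs, and the output filename is illustrative.

import os
+from lasso import Project
+
+# Compare a base and a build Cube transit network and write the
+# differences out as a project card.
+lin_project = Project.create_project(
+    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+lin_project.evaluate_changes()
+lin_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "transit_lin_changes.yml")  # illustrative output name
+)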
+
+

Project Cards from Cube LOG Files

+
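A hedged sketch of the log-file workflow. The keyword-argument names roadway_log_file and base_roadway_dir, and the file names, are assumptions for illustration; check the Project.create_project signature for the exact arguments.

import os
+from lasso import Project
+
+# Keyword-argument names below are assumptions for illustration only;
+# see Project.create_project for the authoritative signature.
+log_project = Project.create_project(
+    roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),  # hypothetical log file
+    base_roadway_dir=STPAUL_DIR,
+)
+log_project.evaluate_changes()
+log_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "roadway_log_changes.yml")  # illustrative output name
+)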
+
+

Model Network Files for a Scenario

+
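A sketch that chains the Scenario, Parameters, and ModelRoadwayNetwork examples above: apply project cards to a base scenario, convert the resulting roadway network, and write it out in a Cube-readable fixed-width format.

# Builds on my_base_scenario, project_cards_list, and my_config from the
+# examples above; the chaining shown here is a sketch, not a prescribed recipe.
+my_scenario = Scenario.create_scenario(
+    base_scenario=my_base_scenario,
+    project_cards_list=project_cards_list,
+)
+my_scenario.apply_all_projects()
+
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+)
+model_road_net.write_roadway_as_fixedwidth()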
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic Lasso functionality, please refer to the following Jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • Lasso Scenario Creation Quickstart.ipynb
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/genindex/index.html b/genindex/index.html new file mode 100644 index 0000000..9362016 --- /dev/null +++ b/genindex/index.html @@ -0,0 +1,1038 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Index

+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/index.html b/index.html new file mode 100644 index 0000000..e09d620 --- /dev/null +++ b/index.html @@ -0,0 +1,179 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the +network_wrangler package +for MetCouncil and MTC. It aims to have the following functionality:

+
    +
  1. parse Cube log files and base highway networks and create ProjectCards for Network Wrangler;

  2. parse two Cube transit line files and create ProjectCards for Network Wrangler; and

  3. refine Network Wrangler highway networks to contain specific variables and settings for the respective agency and export them to a format that can be read by Citilabs’ Cube software.
+ +
+
+

Indices and tables

+ +
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/objects.inv b/objects.inv new file mode 100644 index 0000000..1f37899 Binary files /dev/null and b/objects.inv differ diff --git a/py-modindex/index.html b/py-modindex/index.html new file mode 100644 index 0000000..8e188ed --- /dev/null +++ b/py-modindex/index.html @@ -0,0 +1,134 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Python Module Index

+ +
  • lasso

  • lasso.logger

  • lasso.util
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/running/index.html b/running/index.html new file mode 100644 index 0000000..52a12ed --- /dev/null +++ b/running/index.html @@ -0,0 +1,131 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
+

Create a scenario

+
+
+

Exporting networks

+
+
+

Auditing and Reporting

+
+
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/search/index.html b/search/index.html new file mode 100644 index 0000000..8f4005a --- /dev/null +++ b/search/index.html @@ -0,0 +1,124 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + \ No newline at end of file diff --git a/searchindex.js b/searchindex.js new file mode 100644 index 0000000..a285661 --- /dev/null +++ b/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({"docnames": ["_generated/lasso.CubeTransit", "_generated/lasso.ModelRoadwayNetwork", "_generated/lasso.Parameters", "_generated/lasso.Project", "_generated/lasso.StandardTransit", "_generated/lasso.logger", "_generated/lasso.util", "autodoc", "index", "running", "setup", "starting"], "filenames": ["_generated/lasso.CubeTransit.rst", "_generated/lasso.ModelRoadwayNetwork.rst", "_generated/lasso.Parameters.rst", "_generated/lasso.Project.rst", "_generated/lasso.StandardTransit.rst", "_generated/lasso.logger.rst", "_generated/lasso.util.rst", "autodoc.rst", "index.rst", "running.md", "setup.md", "starting.md"], "titles": ["lasso.CubeTransit", "lasso.ModelRoadwayNetwork", "lasso.Parameters", "lasso.Project", "lasso.StandardTransit", "lasso.logger", "lasso.util", "Lasso Classes and Functions", "Welcome to lasso\u2019s documentation!", "Running Lasso", "Setup", "Starting Out"], "terms": {"class": [0, 1, 2, 3, 4, 6, 8, 11], "paramet": [0, 1, 3, 4, 6, 8], "sourc": [0, 1, 2, 3, 4, 5, 6, 11], "base": [0, 1, 2, 3, 4, 6, 8, 11], "object": [0, 1, 2, 3, 4, 6, 11], "store": [0, 1, 11], "inform": [0, 1, 4, 11], "about": [0, 1, 4, 11], "transit": [0, 1, 2, 3, 4, 8], "defin": [0, 1, 2, 3, 11], "cube": [0, 1, 2, 3, 4, 8], "line": [0, 1, 2, 3, 4, 6, 8, 11], "file": [0, 1, 2, 3, 4, 8], "ha": [0, 6, 11], "capabl": [0, 11], "pars": [0, 8, 11], "properti": [0, 1, 2, 4, 6, 11], "shape": [0, 1, 3, 4, 6, 11], "python": [0, 1, 2, 11], "dictionari": [0, 1, 2, 3, 4, 6, 11], "compar": [0, 3, 4, 11], "repres": [0, 2, 4, 6, 11], "chang": [0, 1, 3, 4, 11], "project": [0, 1, 2, 4, 6, 8], "card": [0, 1, 3, 4], "typic": [0, 3, 4, 6, 8], "usag": [0, 1, 3, 4], "exampl": [0, 1, 3, 4, 6, 11], "tn": [0, 11], "create_from_cub": [0, 11], "cube_dir": [0, 3, 11], "transit_change_list": [0, 11], "evaluate_differ": [0, 4, 11], "base_transit_network": [0, 3, 11], "list": [0, 1, 2, 3, 4, 6, 11], "string": [0, 1, 3, 4, 6], "uniqu": [0, 1, 3], "name": [0, 1, 2, 3, 4, 6, 11], "network": [0, 1, 2, 3, 4, 5, 8], "type": [0, 1, 2, 3, 4, 6, 11], "line_properti": 0, "kei": [0, 1, 3, 11], "valu": [0, 1, 3, 4, 6, 11], "ar": [0, 1, 2, 3, 6, 11], "These": 0, "directli": 0, "read": [0, 1, 3, 4, 8, 11], "from": [0, 1, 2, 3, 4, 6, 8], "haven": 0, "t": [0, 1, 6], "been": [0, 6], "translat": [0, 4, 11], "standard": [0, 1, 2, 4, 11], "dict": [0, 1, 2, 3, 4], "panda": [0, 1, 3], "datafram": [0, 1, 3, 4, 11], "node": [0, 1, 2, 3, 4, 6, 11], "follow": [0, 4, 8, 11], "column": [0, 1, 4], "node_id": 0, "int": [0, 1, 2, 3, 4, 6], "posit": [0, 6], "integ": [0, 1, 2], "id": [0, 1, 3, 4, 11], "number": [0, 1, 2, 3, 4, 6, 11], "neg": [0, 6], "indic": [0, 1, 6], "non": [0, 6], "stop": [0, 4], "boolean": [0, 1], "i": [0, 1, 3, 4, 5, 6, 8, 11], "order": [0, 1, 2, 6, 11], "within": [0, 1, 2, 6], "thi": [0, 1, 2, 3, 4, 6, 8, 11], "program_typ": 0, "either": [0, 1, 5, 6, 11], "pt": 0, "trnbld": 0, "str": [0, 1, 2, 3, 6], "instanc": [0, 1, 2, 3, 4, 11], "appli": [0, 1, 6], "which": [0, 1, 3, 6, 11], "includ": [0, 1, 2, 11], "time": [0, 1, 2, 4, 6, 11], "period": [0, 1, 2, 4, 11], "variabl": [0, 1, 2, 3, 4, 8, 11], "source_list": 0, "have": [0, 1, 6, 8], "ad": [0, 1, 3, 6, 11], "diff_dict": 0, "__init__": [0, 1, 2, 3, 4], "constructor": [0, 1, 3], "set": [0, 1, 2, 3, 4, 5, 6, 8, 11], "see": [0, 1, 3, 4, 6], "an": [0, 1, 3, 4, 6, 11], "method": [0, 1, 2, 3, 4, 6, 11], 
"add_additional_time_period": 0, "new_time_period_numb": 0, "orig_line_nam": 0, "copi": [0, 1, 3, 4, 6, 11], "rout": [0, 1, 4], "anoth": 0, "appropri": [0, 4], "specif": [0, 1, 8, 11], "new": [0, 1, 6, 11], "under": 0, "self": [0, 1, 2, 3, 4, 6], "origin": [0, 1, 6, 11], "its": [0, 1], "return": [0, 1, 3, 4, 6], "add_cub": 0, "transit_sourc": 0, "lin": [0, 3, 4], "add": [0, 1, 3, 6, 11], "exist": [0, 1, 3, 6, 11], "transitnetwork": [0, 4], "directori": [0, 1, 4, 11], "static": [0, 1, 3, 4], "build_route_nam": 0, "route_id": [0, 4], "time_period": [0, 1, 2], "agency_id": 0, "0": [0, 1, 2, 4, 6, 11], "direction_id": 0, "1": [0, 1, 2, 4, 6, 11], "creat": [0, 1, 2, 3, 4, 6, 8, 11], "contaten": 0, "agenc": [0, 8], "direct": [0, 1, 6, 11], "e": [0, 1, 11], "452": 0, "111": 0, "pk": [0, 2], "construct": [0, 6, 11], "line_nam": 0, "0_452": 0, "111_452_pk1": 0, "calculate_start_end_tim": 0, "line_properties_dict": 0, "calcul": [0, 1, 2, 3, 4, 6], "start": [0, 1, 4, 8], "end": [0, 1, 6], "warn": [0, 1], "doesn": [0, 1], "take": [0, 1], "care": 0, "discongru": 0, "flavor": [0, 1], "create_add_route_card_dict": 0, "format": [0, 1, 2, 4, 8, 11], "route_properti": 0, "being": 0, "updat": [0, 1, 11], "A": [0, 1, 2, 3, 4, 6, 11], "addit": [0, 1, 6, 8, 11], "create_delete_route_card_dict": 0, "base_transit_line_properties_dict": 0, "delet": [0, 1], "style": [0, 6], "attribut": [0, 1, 2, 3, 4, 11], "find": [0, 1, 4], "create_update_route_card_dict": 0, "updated_properties_dict": 0, "cube_properties_to_standard_properti": 0, "cube_properties_dict": 0, "convert": [0, 1, 4, 6], "most": 0, "pertin": 0, "like": [0, 1, 2, 6], "headwai": [0, 2, 4], "varibl": [0, 2], "stnadard": 0, "unit": [0, 1, 6], "minut": 0, "second": [0, 4, 6], "correct": 0, "base_transit": 0, "identifi": [0, 1, 11], "what": [0, 1, 3, 6], "need": [0, 1, 2, 3, 4], "For": [0, 1, 4, 6], "multipl": [0, 1, 6, 11], "make": [0, 1, 11], "duplic": 0, "so": [0, 1, 5, 6, 11], "each": [0, 1, 3, 6], "condit": [0, 1], "contain": [0, 1, 4, 6, 8, 11], "requir": [0, 1, 6, 11], "evalu": [0, 1, 3], "differ": [0, 2, 6], "between": [0, 1, 2, 3, 6], "evaluate_route_property_differ": 0, "properties_build": 0, "properties_bas": 0, "time_period_numb": 0, "absolut": [0, 6], "true": [0, 1, 3, 4, 5, 6, 11], "validate_bas": 0, "fals": [0, 1, 3, 4, 6, 11], "check": [0, 1, 3, 6], "ani": [0, 1, 3, 6, 11], "entri": [0, 3], "property_nam": 0, "property_valu": 0, "us": [0, 1, 2, 3, 4, 6, 11], "command": [0, 1, 3], "rather": [0, 1], "than": [0, 1, 6], "If": [0, 1, 3, 4, 6, 11], "automat": 0, "note": [0, 6, 11], "onli": [0, 1, 3, 4, 6], "numer": [0, 4, 6], "frequenc": [0, 4, 11], "suitabl": 0, "write": [0, 1, 3, 4, 11], "evaluate_route_shape_chang": 0, "shape_build": 0, "shape_bas": 0, "two": [0, 1, 6, 8, 11], "build": [0, 1, 3, 11], "version": [0, 6, 11], "ddatafram": 0, "get_time_period_numbers_from_cube_properti": 0, "properties_list": 0, "associ": [0, 2], "them": [0, 1, 4, 6, 8], "all": [0, 1, 2, 5, 6, 11], "found": [0, 1, 6], "unpack_route_nam": 0, "unpack": 0, "info": [0, 1, 2], "link": [1, 2, 3, 6, 11], "kwarg": [1, 2, 3, 6], "roadwaynetwork": [1, 3, 4], "subclass": [1, 11], "network_wrangl": [1, 8, 11], "represent": [1, 3, 4, 6], "physic": 1, "roadwai": [1, 2, 3, 4, 11], "geodatafram": [1, 11], "specifi": [1, 3, 6], "default": [1, 2, 3, 4, 6, 11], "cr": [1, 3, 6], "coordin": [1, 3, 6], "refer": [1, 3, 4, 6, 11], "system": [1, 3], "espg": [1, 3], "node_foreign_kei": [1, 3], "tabl": [1, 2, 3, 6], "link_foreign_kei": [1, 3], "foreign": [1, 3], "shape_foreign_kei": [1, 3, 
11], "unique_link_id": [1, 3], "unique_node_id": [1, 3], "modes_to_network_link_vari": [1, 3], "map": [1, 2, 3, 6, 11], "mode": [1, 3, 4, 11], "modes_to_network_nodes_vari": [1, 3], "managed_lanes_node_id_scalar": [1, 3], "scalar": [1, 3, 6], "primari": [1, 3], "correspond": [1, 3, 4], "manag": [1, 2, 3, 11], "lane": [1, 2, 3, 11], "managed_lanes_link_id_scalar": [1, 3], "managed_lanes_required_attribut": [1, 3], "must": [1, 3, 6], "keep_same_attributes_ml_and_gp": [1, 3], "parallel": [1, 3, 6], "gener": [1, 3], "purpos": [1, 3, 6], "add_count": 1, "network_vari": 1, "aadt": 1, "mndot_count_shst_data": 1, "none": [1, 3, 4, 5, 6], "widot_count_shst_data": 1, "mndot_count_variable_shp": [1, 2], "widot_count_variable_shp": 1, "count": [1, 2], "mc": [1, 2, 4], "join": [1, 3, 4, 6, 11], "data": [1, 2, 3, 4, 8, 11], "via": 1, "shst": 1, "api": 1, "match": 1, "result": [1, 6, 11], "should": [1, 2, 3, 6, 11], "written": [1, 11], "path": [1, 3, 4, 6, 11], "mndot": [1, 2], "locat": [1, 2, 3, 4], "widot": 1, "geodatabas": 1, "add_incident_link_data_to_nod": 1, "links_df": [1, 11], "nodes_df": [1, 11], "link_vari": 1, "unique_node_kei": 1, "model_node_id": [1, 2], "go": [1, 11], "assess": [1, 3], "connect": 1, "incid": 1, "where": 1, "length": [1, 6], "n": [1, 2, 3, 6, 11], "out": [1, 2, 5, 6, 8], "add_new_roadway_feature_chang": 1, "featur": [1, 6], "also": [1, 6, 11], "valid": [1, 6, 11], "add_variable_using_shst_refer": 1, "var_shst_csvdata": 1, "shst_csv_variabl": 1, "network_var_typ": 1, "overwrit": 1, "bool": [1, 3, 6], "addition_map": 1, "show": 1, "project_card_dictionari": 1, "wrapper": [1, 8, 11], "apply_managed_lane_feature_chang": 1, "link_idx": 1, "in_plac": [1, 11], "lndice": 1, "whether": 1, "decid": 1, "connector": [1, 2], "when": [1, 3, 4, 6], "thei": [1, 2], "more": [1, 6, 11], "apply_python_calcul": 1, "pycod": 1, "execut": 1, "code": [1, 2, 4, 6, 11], "apply_roadway_feature_chang": [1, 11], "select": [1, 11], "pass": [1, 5, 6], "assess_connect": [1, 11], "ignore_end_nod": [1, 11], "graph": 1, "disconnect": 1, "subgraph": 1, "describ": [1, 11], "member": [1, 6], "one": [1, 6, 11], "drive": [1, 11], "walk": [1, 11], "bike": 1, "ignor": [1, 6], "strai": 1, "singleton": 1, "tupl": [1, 6], "osmnx": [1, 11], "networkx": 1, "digraph": 1, "build_selection_kei": 1, "selection_dict": 1, "combin": [1, 2, 4, 6], "queri": [1, 11], "b": [1, 2, 11], "you": [1, 3, 11], "selection_dictonari": 1, "serv": 1, "calculate_area_typ": 1, "area_type_shap": [1, 2], "area_type_shape_vari": 1, "area_typ": [1, 2, 3], "area_type_codes_dict": 1, "downtown_area_type_shap": [1, 2], "downtown_area_typ": [1, 2], "area": [1, 2, 3, 6], "centroid": [1, 2, 6], "geometri": [1, 2, 6], "field": [1, 2, 11], "determin": [1, 3, 6], "label": 1, "isn": 1, "perfect": 1, "much": 1, "quicker": 1, "other": [1, 6, 11], "The": [1, 2, 5, 6, 11], "geodadabas": 1, "input": [1, 6], "downtown": [1, 2], "boundari": [1, 2, 6], "counti": [1, 2, 3], "calculate_centroidconnect": 1, "centroidconnect": [1, 2, 3], "highest_taz_numb": [1, 2], "as_integ": 1, "max": 1, "taz": [1, 2], "calculate_counti": 1, "county_shap": [1, 2], "county_shape_vari": 1, "county_codes_dict": 1, "calculate_dist": 1, "distanc": [1, 2, 3, 6], "centroidconnect_onli": 1, "mile": 1, "centroidconnector": 1, "calculate_mpo": 1, "county_network_vari": 1, "mpo": [1, 2], "mpo_counti": [1, 2], "param": [1, 4], "county_vari": 1, "region": [1, 6], "calculate_us": 1, "defauli": 1, "convert_int": 1, "int_col_nam": 1, "create_ml_vari": 1, "ml_lane": [1, 2], "ml": 1, "placehold": 1, 
"come": 1, "log": [1, 3, 8], "create_calculated_vari": 1, "create_dummy_connector_link": 1, "ml_df": 1, "access_lan": 1, "egress_lan": 1, "access_roadwai": 1, "ml_access": 1, "egress_roadwai": 1, "access_name_prefix": 1, "access": [1, 4, 6], "dummi": 1, "egress_name_prefix": 1, "egress": 1, "gp_df": 1, "roadai": 1, "prefix": 1, "create_hov_corridor_vari": 1, "segment_id": [1, 2], "hov": 1, "corridor": 1, "create_managed_lane_network": [1, 11], "keep_additional_attributes_ml_and_gp": 1, "separ": [1, 3, 6], "look": 1, "want": [1, 11], "leav": 1, "some": 1, "rigor": 1, "test": [1, 2, 6], "create_managed_vari": 1, "dataframe_to_fixed_width": 1, "df": 1, "fix": [1, 2], "width": [1, 6], "transform": [1, 6], "delete_roadway_feature_chang": 1, "ignore_miss": 1, "get": [1, 2, 6, 11], "miss": 1, "fail": [1, 6], "deletion_map": 1, "fill_na": 1, "fill": [1, 6], "na": 1, "from_roadwaynetwork": [1, 11], "roadway_network_object": 1, "get_attribut": 1, "join_kei": 1, "source_shst_ref_df": 1, "source_gdf": 1, "field_nam": 1, "get_managed_lane_node_id": 1, "nodes_list": 1, "4500000": 1, "237": 1, "get_modal_graph": 1, "bike_access": [1, 2], "bu": [1, 4], "bus_onli": [1, 2], "drive_access": [1, 2, 11], "rail": [1, 4], "rail_onli": [1, 2], "walk_access": [1, 2], "strongli": 1, "vertex": [1, 6], "reachabl": 1, "everi": [1, 6], "get_modal_links_nod": 1, "kept": 1, "both": [1, 4, 6, 11], "filter": [1, 6], "right": [1, 6], "now": 1, "we": [1, 4, 11], "don": 1, "becaus": [1, 4, 6, 11], "mark": 1, "issu": 1, "discuss": 1, "http": [1, 4, 6, 11], "github": [1, 6, 11], "com": [1, 4, 6, 11], "wsp": [1, 11], "sag": [1, 11], "145": 1, "modal_nodes_df": 1, "mode_node_vari": 1, "get_property_by_time_period_and_group": 1, "prop": 1, "categori": [1, 2, 11], "default_return": 1, "seri": [1, 11], "group": 1, "16": [1, 2, 6], "00": [1, 2], "19": [1, 2], "option": [1, 6], "sov": [1, 2], "search": [1, 2, 8, 11], "hov3": [1, 2], "hov2": [1, 2], "identify_seg": 1, "o_id": 1, "d_id": 1, "endpoint": 1, "up": [1, 5, 11], "segment": [1, 4, 6, 11], "candid": 1, "otherwis": [1, 4, 6], "ram": 1, "hog": 1, "could": [1, 6, 11], "odd": 1, "shortest": 1, "segment_vari": 1, "keep": [1, 6], "identify_segment_endpoint": 1, "min_connecting_link": 1, "10": [1, 2, 6], "min_dist": 1, "max_link_devi": 1, "2": [1, 2, 4, 6, 11], "is_network_connect": [1, 11], "consid": [1, 6, 11], "cach": 1, "long": [1, 6], "load_transform_network": 1, "node_filenam": [1, 11], "link_filenam": [1, 11], "shape_filenam": [1, 11], "4326": [1, 6], "validate_schema": 1, "disk": 1, "schema": [1, 11], "shapes_df": [1, 11], "network_connection_plot": 1, "g": [1, 6], "disconnected_subgraph_nod": 1, "plot": 1, "fig": 1, "ax": [1, 6], "orig_dest_nodes_foreign_kei": 1, "whatev": 1, "u": 1, "v": [1, 2, 11], "ab": 1, "noth": 1, "assum": 1, "a_id": 1, "b_id": 1, "ox_graph": 1, "unique_link_kei": 1, "model_link_id": [1, 2, 3], "arrai": [1, 6], "remov": [1, 6], "certain": 1, "do": [1, 6, 11], "too": [1, 5], "link_df": 1, "referenc": 1, "multidigraph": 1, "path_search": 1, "candidate_links_df": 1, "weight_column": 1, "weight_factor": 1, "search_breadth": 1, "5": [1, 2, 4, 6], "max_search_breadth": 1, "candidate_link": 1, "part": [1, 6, 11], "foreigh": 1, "destin": 1, "weight": 1, "iter": [1, 6], "multipli": 1, "fast": [1, 11], "recalculate_calculated_vari": [1, 3], "recalculate_dist": [1, 3], "json": [1, 11], "geojson": 1, "skip": 1, "speed": 1, "spatial": [1, 6], "etc": [1, 2, 3, 11], "re": 1, "read_match_result": 1, "lot": 1, "same": [1, 4, 6], "concaten": 1, "singl": [1, 3, 6], 
"geopanda": [1, 11], "sure": 1, "why": 1, "util": [1, 8, 11], "rename_variables_for_dbf": 1, "input_df": 1, "variable_crosswalk": 1, "output_vari": [1, 2], "convert_geometry_to_xi": 1, "renam": [1, 3], "dbf": 1, "shp": [1, 2], "char": 1, "crosswalk": [1, 3], "x": [1, 2, 6], "y": [1, 2, 6], "roadway_net_to_gdf": 1, "roadway_net": 1, "turn": [1, 11], "export": [1, 8, 11], "sophist": 1, "attach": 1, "roadway_standard_to_met_council_network": 1, "output_epsg": [1, 2], "consist": [1, 6, 11], "metcouncil": [1, 2, 4, 8, 11], "": [1, 2, 4, 6, 11], "model": [1, 2], "expect": [1, 6], "epsg": [1, 2, 6], "output": [1, 2, 3, 4, 6], "select_roadway_featur": [1, 11], "search_mod": 1, "force_search": 1, "sp_weight_factor": 1, "satisfi": [1, 6], "criteria": 1, "net": [1, 11], "osm": [1, 2], "share": [1, 11], "street": [1, 11], "osm_model_link_id": 1, "1234": 1, "shstid": 1, "4321": 1, "regex": 1, "facil": [1, 2, 11], "main": 1, "st": [1, 2], "least": [1, 11], "perform": [1, 4], "even": 1, "previou": 1, "discourag": 1, "meander": 1, "ref": 1, "here": [1, 6], "defaul": 1, "selection_has_unique_link_id": 1, "selection_dictionari": 1, "selection_map": 1, "selected_link_idx": 1, "candidate_link_idx": 1, "selected_links_idx": 1, "candidate_links_idx": 1, "shortest_path": 1, "graph_links_df": 1, "100": 1, "four": 1, "nx": 1, "split_properties_by_time_period_and_categori": 1, "properties_to_split": [1, 2], "split": [1, 2, 4], "structur": 1, "stratifi": 1, "times_period": 1, "am": [1, 2], "6": [1, 2, 4, 6], "pm": [1, 2], "15": [1, 2], "update_dist": 1, "use_shap": 1, "inplac": 1, "straight": 1, "avail": 1, "portion": 1, "provid": [1, 3, 4, 6, 11], "entir": 1, "crow": 1, "fly": 1, "meter": [1, 6], "nan": 1, "validate_link_schema": 1, "schema_loc": 1, "roadway_network_link": 1, "validate_node_schema": 1, "node_fil": 1, "roadway_network_nod": 1, "validate_properti": 1, "ignore_exist": 1, "require_existing_for_chang": 1, "theproject": 1, "dictonari": 1, "validate_select": 1, "selection_requir": 1, "whetther": 1, "minimum": [1, 5, 6], "validate_shape_schema": 1, "shape_fil": 1, "roadway_network_shap": 1, "validate_uniqu": 1, "confirm": 1, "met": 1, "filenam": [1, 3, 11], "were": 1, "save": 1, "write_roadway_as_fixedwidth": [1, 11], "output_dir": 1, "node_output_vari": 1, "link_output_vari": 1, "output_link_txt": [1, 2], "output_node_txt": [1, 2], "output_link_header_width_txt": [1, 2], "output_node_header_width_txt": [1, 2], "output_cube_network_script": [1, 2], "drive_onli": 1, "function": [1, 5, 6, 8, 11], "doe": [1, 4], "header": [1, 2], "3": [1, 2, 4, 6, 11], "script": [1, 2], "run": [1, 8], "databas": [1, 3], "record": [1, 4], "write_roadway_as_shp": [1, 11], "data_to_csv": 1, "data_to_dbf": 1, "output_link_shp": [1, 2], "output_node_shp": [1, 2], "output_link_csv": [1, 2], "output_node_csv": [1, 2], "output_gpkg": 1, "output_link_gpkg_lay": 1, "output_node_gpkg_lay": 1, "output_gpkg_link_filt": 1, "gpkg": 1, "csv": [1, 2, 3, 11], "full": [1, 6], "geopackag": 1, "layer": [1, 11], "subset": 1, "calculated_valu": [1, 3], "dai": [2, 4, 11], "can": [2, 6, 8, 11], "runtim": [2, 11], "initi": [2, 3, 11], "keyword": [2, 6, 11], "argument": [2, 6, 11], "explicitli": [2, 11], "highlight": [2, 11], "attr": 2, "time_period_to_tim": 2, "abbrevi": [2, 4], "gtf": [2, 4, 11], "highwai": [2, 3, 8, 11], "ea": 2, "md": 2, "ev": 2, "cube_time_period": 2, "4": [2, 4, 6], "demand": 2, "allow": [2, 6], "suffix": 2, "truck": 2, "trk": 2, "final": 2, "lanes_am": 2, "time_periods_to_tim": 2, "shapefil": 2, "r": 2, "metcouncil_data": 2, 
"cb_2017_us_county_5m": 2, "county_variable_shp": 2, "lanes_lookup_fil": 2, "lookup": 2, "centroid_connect_lan": 2, "anoka": 2, "dakota": 2, "hennepin": 2, "ramsei": 2, "scott": 2, "washington": 2, "carver": 2, "taz_shap": 2, "tazofficialwcurrentforecast": 2, "taz_data": 2, "highest": 2, "3100": 2, "link_id": 2, "shstgeometryid": 2, "roadway_class": 2, "truck_access": 2, "trn_priority_ea": 2, "trn_priority_am": 2, "trn_priority_md": 2, "trn_priority_pm": 2, "trn_priority_ev": 2, "ttime_assert_ea": 2, "ttime_assert_am": 2, "ttime_assert_md": 2, "ttime_assert_pm": 2, "ttime_assert_ev": 2, "lanes_ea": 2, "lanes_md": 2, "lanes_pm": 2, "lanes_ev": 2, "price_sov_ea": 2, "price_hov2_ea": 2, "price_hov3_ea": 2, "price_truck_ea": 2, "price_sov_am": 2, "price_hov2_am": 2, "price_hov3_am": 2, "price_truck_am": 2, "price_sov_md": 2, "price_hov2_md": 2, "price_hov3_md": 2, "price_truck_md": 2, "price_sov_pm": 2, "price_hov2_pm": 2, "price_hov3_pm": 2, "price_truck_pm": 2, "price_sov_ev": 2, "price_hov2_ev": 2, "price_hov3_ev": 2, "price_truck_ev": 2, "roadway_class_idx": 2, "facility_typ": 2, "osm_node_id": [2, 6, 11], "bike_nod": 2, "transit_nod": 2, "walk_nod": 2, "drive_nod": 2, "ml_lanes_ea": 2, "ml_lanes_am": 2, "ml_lanes_md": 2, "ml_lanes_pm": 2, "ml_lanes_ev": 2, "osm_facility_type_dict": 2, "thrivemsp2040communitydesign": 2, "area_type_variable_shp": 2, "comdes2040": 2, "area_type_code_dict": 2, "23": 2, "urban": [2, 4], "center": [2, 6], "24": 2, "25": 2, "35": 2, "36": 2, "41": 2, "51": 2, "52": 2, "53": 2, "60": 2, "downtownzones_taz": 2, "mrcc_roadway_class_shap": 2, "mrcc": 2, "trans_mrcc_centerlin": 2, "mrcc_roadway_class_variable_shp": 2, "mrcc_roadway_class_shp": 2, "route_si": 2, "widot_roadway_class_shap": 2, "wisconsin": 2, "wisconsin_lanes_counts_median": 2, "wislr": 2, "widot_roadway_class_variable_shp": 2, "rdwy_ctgy_": 2, "mndot_count_shap": 2, "count_mn": 2, "aadt_2017_count_loc": 2, "osm_highway_facility_type_crosswalk": 2, "legacy_tm2_attribut": 2, "shstreferenceid": 2, "legaci": 2, "tm2": 2, "osm_lanes_attribut": 2, "tam_tm2_attribut": 2, "tam": 2, "tom_tom_attribut": 2, "tomtom": 2, "tomtom_attribut": 2, "sfcta_attribut": 2, "sfcta": 2, "geograph": 2, "102646": 2, "scratch": 2, "txt": [2, 11], "links_header_width": 2, "nodes_header_width": 2, "import": [2, 6], "make_complete_network_from_fixed_width_fil": 2, "county_link_range_dict": 2, "county_code_dict": 2, "7": [2, 4, 11], "extern": 2, "chisago": 2, "11": 2, "goodhu": 2, "12": 2, "isanti": 2, "13": 2, "le": 2, "sueur": 2, "14": 2, "mcleod": 2, "pierc": 2, "polk": 2, "17": 2, "rice": 2, "18": 2, "sherburn": 2, "siblei": 2, "20": 2, "croix": 2, "21": 2, "wright": 2, "22": 2, "route_type_bus_mode_dict": 2, "urb": 2, "loc": 2, "sub": [2, 6], "express": [2, 4, 6], "route_type_mode_dict": 2, "8": [2, 4, 6], "9": [2, 4], "cube_time_periods_nam": 2, "op": 2, "detail": [2, 5], "zone": 2, "possibl": [2, 6], "roadway_link_chang": 3, "roadway_node_chang": 3, "transit_chang": [3, 4], "base_roadway_network": 3, "base_cube_transit_network": 3, "build_cube_transit_network": 3, "project_nam": 3, "produc": [3, 6], "test_project": [3, 11], "create_project": [3, 11], "base_cube_transit_sourc": 3, "o": [3, 4, 11], "build_cube_transit_sourc": 3, "transit_route_shape_chang": [3, 11], "evaluate_chang": [3, 11], "write_project_card": [3, 11], "scratch_dir": [3, 11], "t_transit_shape_test": [3, 11], "yml": [3, 11], "default_project_nam": 3, "level": 3, "constant": 3, "static_valu": 3, "card_data": 3, "cubetransit": [3, 8], "bunch": 3, 
"projectcard": [3, 8], "case": 3, "standardtransit": [3, 8], "add_highway_chang": 3, "limit_variables_to_existing_network": 3, "hoc": [3, 11], "add_transit_chang": 3, "roadway_log_fil": 3, "roadway_shp_fil": 3, "roadway_csv_fil": 3, "network_build_fil": 3, "emme_node_id_crosswalk_fil": 3, "emme_name_crosswalk_fil": 3, "base_roadway_dir": 3, "base_transit_dir": [3, 4, 11], "consum": 3, "logfil": 3, "emm": [3, 4], "folder": 3, "base_cube_transit_fil": 3, "build_cube_transit_fil": 3, "first": [3, 4, 6], "recalcul": 3, "determine_roadway_network_changes_compat": 3, "emme_id_to_wrangler_id": 3, "emme_link_change_df": 3, "emme_node_change_df": 3, "emme_transit_changes_df": 3, "rewrit": 3, "wrangler": [3, 8, 11], "emme_name_to_wrangler_nam": 3, "aggreg": 3, "get_object_from_network_build_command": 3, "row": [3, 4], "histori": 3, "l": 3, "get_operation_from_network_build_command": 3, "action": 3, "c": [3, 6], "d": [3, 6], "read_logfil": 3, "logfilenam": 3, "reprsent": [3, 11], "read_network_build_fil": 3, "networkbuildfilenam": 3, "nework": 3, "assign_group": 3, "user": [3, 11], "TO": 3, "ptg_feed": 4, "hold": [4, 11], "feed": [4, 11], "partridg": [4, 11], "manipul": [4, 11], "cube_transit_net": [4, 11], "read_gtf": [4, 11], "write_as_cube_lin": [4, 11], "write_dir": [4, 11], "outfil": [4, 11], "calculate_cube_mod": 4, "assign": 4, "logic": 4, "route_typ": 4, "develop": [4, 11], "googl": 4, "cube_mod": 4, "route_type_to_cube_mod": 4, "tram": 4, "streetcar": 4, "light": 4, "further": 4, "disaggreg": 4, "buse": 4, "suburban": 4, "longnam": 4, "lower": [4, 6], "elif": 4, "99": 4, "local": [4, 11], "els": [4, 6], "route_long_nam": 4, "cube_format": 4, "represnt": 4, "notat": 4, "trip": 4, "trip_id": 4, "shape_id": [4, 11], "tod": 4, "onewai": 4, "oper": [4, 6], "fromtransitnetwork": 4, "transit_network_object": 4, "modelroadwaynetwork": [4, 8], "gtfs_feed_dir": 4, "route_properties_gtfs_to_cub": 4, "prepar": 4, "trip_df": 4, "shape_gtfs_to_cub": 4, "add_nntim": 4, "shape_gtfs_to_dict_list": 4, "step": 4, "through": 4, "todo": 4, "elimin": 4, "necess": 4, "tag": [4, 11], "begin": 4, "As": 4, "m": 4, "minim": 4, "modif": 4, "question": 4, "shape_pt_sequ": 4, "shape_mode_node_id": 4, "is_stop": 4, "stop_sequ": 4, "shape_gtfs_to_emm": 4, "trip_row": 4, "time_to_cube_time_period": 4, "start_time_sec": 4, "as_str": 4, "verbos": 4, "midnight": [4, 6], "this_tp": 4, "this_tp_num": 4, "outpath": 4, "after": 4, "setuplog": 5, "infologfilenam": 5, "debuglogfilenam": 5, "logtoconsol": 5, "infolog": 5, "ters": 5, "just": 5, "give": 5, "bare": 5, "composit": 5, "clear": 5, "later": 5, "debuglog": 5, "veri": [5, 6], "noisi": 5, "debug": 5, "spew": 5, "consol": 5, "point": 6, "arg": 6, "basegeometri": 6, "possibli": 6, "z": 6, "zero": 6, "dimension": 6, "float": 6, "sequenc": 6, "individu": 6, "p": 6, "print": 6, "almost_equ": 6, "decim": 6, "equal": 6, "place": 6, "deprec": 6, "sinc": 6, "confus": 6, "equals_exact": 6, "instead": 6, "approxim": 6, "compon": [6, 8], "linestr": 6, "1e": 6, "buffer": 6, "quad_seg": 6, "cap_styl": 6, "round": 6, "join_styl": 6, "mitre_limit": 6, "single_sid": 6, "dilat": 6, "eros": 6, "small": 6, "mai": 6, "sometim": 6, "tidi": 6, "polygon": 6, "around": [6, 8, 11], "resolut": 6, "angl": 6, "fillet": 6, "buffercapstyl": 6, "squar": 6, "flat": 6, "circular": 6, "rectangular": 6, "while": 6, "involv": 6, "bufferjoinstyl": 6, "mitr": 6, "bevel": 6, "midpoint": 6, "edg": [6, 8], "touch": 6, "depend": 6, "limit": 6, "ratio": 6, "sharp": 6, "corner": 6, "offset": 6, "meet": 6, "miter": 6, 
"extend": 6, "To": [6, 11], "prevent": 6, "unreason": 6, "control": 6, "maximum": 6, "exce": 6, "side": 6, "sign": 6, "left": 6, "hand": 6, "regular": 6, "cap": 6, "alwai": 6, "forc": 6, "equival": 6, "cap_flat": 6, "quadseg": 6, "alia": 6, "strictli": 6, "wkt": 6, "load": 6, "gon": 6, "approx": 6, "radiu": 6, "circl": 6, "1365484905459": 6, "128": 6, "141513801144": 6, "triangl": 6, "exterior": 6, "coord": 6, "contains_properli": 6, "complet": 6, "common": 6, "document": [6, 11], "covered_bi": 6, "cover": 6, "cross": 6, "grid_siz": 6, "disjoint": 6, "unitless": 6, "dwithin": 6, "given": [6, 11], "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "line_interpolate_point": 6, "intersect": 6, "line_locate_point": 6, "nearest": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "point_on_surfac": 6, "guarante": 6, "cheapli": 6, "representative_point": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "interior": 6, "unchang": 6, "is_ccw": 6, "clockwis": 6, "max_segment_length": 6, "vertic": 6, "longer": 6, "evenli": 6, "subdivid": 6, "densifi": 6, "unmodifi": 6, "array_lik": 6, "greater": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "union": 6, "dimens": 6, "bound": 6, "collect": 6, "empti": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "orient": 6, "rotat": 6, "rectangl": 6, "enclos": 6, "unlik": 6, "constrain": 6, "degener": 6, "oriented_envelop": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "pair": [6, 11], "tripl": 6, "abov": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, "transliter": 6, "unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 
6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, "unidecodeerror": 6, "substitut": 6, "might": [6, 11], "packag": [8, 11], "mtc": 8, "aim": 8, "networkwrangl": [8, 11], "refin": 8, "respect": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "clone": 8, "brief": 8, "intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, "rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "travel": 11, "break": 11, "publictransport": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", "line_properties"], [0, 3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 
3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], [1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", "calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "cube_time_periods"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatability"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", "get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, 
"", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", "roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, "", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", "cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_dict_list"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", "time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], 
[6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 2, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 8, "sphinx.domains.index": 1, "sphinx.domains.javascript": 2, "sphinx.domains.math": 2, "sphinx.domains.python": 3, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 57}, "alltitles": {"lasso.CubeTransit": [[0, "lasso-cubetransit"]], "lasso.ModelRoadwayNetwork": [[1, "lasso-modelroadwaynetwork"]], "Todo": [[1, "id1"], [1, "id2"], [1, "id3"], [1, "id4"], [1, "id5"], [1, "id6"]], "lasso.Parameters": [[2, "lasso-parameters"]], "lasso.Project": [[3, "lasso-project"]], "lasso.StandardTransit": [[4, "lasso-standardtransit"]], "lasso.logger": [[5, "module-lasso.logger"]], "lasso.util": [[6, "module-lasso.util"]], "Lasso Classes and Functions": [[7, "module-lasso"]], "Base Classes": [[7, "base-classes"]], "Utils and Functions": [[7, "utils-and-functions"]], 
"Welcome to lasso\u2019s documentation!": [[8, "welcome-to-lasso-s-documentation"]], "Contents:": [[8, null]], "Indices and tables": [[8, "indices-and-tables"]], "Running Lasso": [[9, "running-lasso"]], "Create project files": [[9, "create-project-files"]], "Create a scenario": [[9, "create-a-scenario"]], "Exporting networks": [[9, "exporting-networks"]], "Auditing and Reporting": [[9, "auditing-and-reporting"]], "Setup": [[10, "setup"]], "Projects": [[10, "projects"]], "Parameters": [[10, "parameters"], [11, "parameters"]], "Settings": [[10, "settings"]], "Additional Data Files": [[10, "additional-data-files"]], "Starting Out": [[11, "starting-out"]], "Installation": [[11, "installation"]], "Bleeding Edge": [[11, "bleeding-edge"]], "From Clone": [[11, "from-clone"]], "Brief Intro": [[11, "brief-intro"]], "Components": [[11, "components"]], "RoadwayNetwork": [[11, "roadwaynetwork"]], "TransitNetwork": [[11, "transitnetwork"]], "ProjectCard": [[11, "projectcard"]], "Scenario": [[11, "scenario"]], "Project": [[11, "project"]], "ModelRoadwayNetwork": [[11, "modelroadwaynetwork"]], "StandardTransit": [[11, "standardtransit"]], "CubeTransit": [[11, "cubetransit"]], "Typical Workflow": [[11, "typical-workflow"]], "Project Cards from Transit LIN Files": [[11, "project-cards-from-transit-lin-files"]], "Project Cards from Cube LOG Files": [[11, "project-cards-from-cube-log-files"]], "Model Network Files for a Scenario": [[11, "model-network-files-for-a-scenario"]], "Running Quickstart Jupyter Notebooks": [[11, "running-quickstart-jupyter-notebooks"]]}, "indexentries": {"cubetransit (class in lasso)": [[0, "lasso.CubeTransit"]], "__init__() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.__init__"]], "add_additional_time_periods() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_additional_time_periods"]], "add_cube() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_cube"]], "build_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.build_route_name"]], "calculate_start_end_times() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.calculate_start_end_times"]], "create_add_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_add_route_card_dict"]], "create_delete_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_delete_route_card_dict"]], "create_from_cube() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.create_from_cube"]], "create_update_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_update_route_card_dict"]], "cube_properties_to_standard_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.cube_properties_to_standard_properties"]], "diff_dict (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.diff_dict"]], "evaluate_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_differences"]], "evaluate_route_property_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_route_property_differences"]], "evaluate_route_shape_changes() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.evaluate_route_shape_changes"]], "get_time_period_numbers_from_cube_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.get_time_period_numbers_from_cube_properties"]], "line_properties (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.line_properties"]], "lines (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.lines"]], "parameters (lasso.cubetransit attribute)": [[0, 
"lasso.CubeTransit.parameters"]], "program_type (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.program_type"]], "shapes (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.shapes"]], "source_list (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.source_list"]], "unpack_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.unpack_route_name"]], "calculated_values (lasso.modelroadwaynetwork attribute)": [[1, "lasso.ModelRoadwayNetwork.CALCULATED_VALUES"]], "modelroadwaynetwork (class in lasso)": [[1, "lasso.ModelRoadwayNetwork"]], "__init__() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.__init__"]], "add_counts() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_counts"]], "add_incident_link_data_to_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.add_incident_link_data_to_nodes"]], "add_new_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_new_roadway_feature_change"]], "add_variable_using_shst_reference() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_variable_using_shst_reference"]], "addition_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.addition_map"]], "apply() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply"]], "apply_managed_lane_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_managed_lane_feature_change"]], "apply_python_calculation() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_python_calculation"]], "apply_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_roadway_feature_change"]], "assess_connectivity() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.assess_connectivity"]], "build_selection_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.build_selection_key"]], "calculate_area_type() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_area_type"]], "calculate_centroidconnect() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_centroidconnect"]], "calculate_county() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_county"]], "calculate_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_distance"]], "calculate_mpo() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_mpo"]], "calculate_use() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_use"]], "convert_int() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.convert_int"]], "create_ml_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_ML_variable"]], "create_calculated_variables() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_calculated_variables"]], "create_dummy_connector_links() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_dummy_connector_links"]], "create_hov_corridor_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_hov_corridor_variable"]], "create_managed_lane_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_lane_network"]], "create_managed_variable() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.create_managed_variable"]], "dataframe_to_fixed_width() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.dataframe_to_fixed_width"]], "delete_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.delete_roadway_feature_change"]], "deletion_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.deletion_map"]], "fill_na() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.fill_na"]], "from_roadwaynetwork() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.from_RoadwayNetwork"]], "get_attribute() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_attribute"]], "get_managed_lane_node_ids() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_managed_lane_node_ids"]], "get_modal_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_graph"]], "get_modal_links_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_links_nodes"]], "get_property_by_time_period_and_group() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.get_property_by_time_period_and_group"]], "identify_segment() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment"]], "identify_segment_endpoints() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment_endpoints"]], "is_network_connected() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.is_network_connected"]], "load_transform_network() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.load_transform_network"]], "network_connection_plot() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.network_connection_plot"]], "orig_dest_nodes_foreign_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.orig_dest_nodes_foreign_key"]], "ox_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.ox_graph"]], "path_search() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.path_search"]], "read() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read"]], "read_match_result() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read_match_result"]], "rename_variables_for_dbf() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.rename_variables_for_dbf"]], "roadway_net_to_gdf() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.roadway_net_to_gdf"]], "roadway_standard_to_met_council_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.roadway_standard_to_met_council_network"]], "select_roadway_features() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.select_roadway_features"]], "selection_has_unique_link_id() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_has_unique_link_id"]], "selection_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_map"]], "shortest_path() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.shortest_path"]], "split_properties_by_time_period_and_category() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.split_properties_by_time_period_and_category"]], "update_distance() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.update_distance"]], "validate_link_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_link_schema"]], "validate_node_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_node_schema"]], "validate_properties() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_properties"]], "validate_selection() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_selection"]], "validate_shape_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_shape_schema"]], "validate_uniqueness() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_uniqueness"]], "write() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write"]], "write_roadway_as_fixedwidth() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_fixedwidth"]], "write_roadway_as_shp() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_shp"]], "parameters (class in lasso)": [[2, "lasso.Parameters"]], "__init__() (lasso.parameters method)": [[2, "lasso.Parameters.__init__"]], "county_link_range_dict (lasso.parameters attribute)": [[2, "lasso.Parameters.county_link_range_dict"]], "cube_time_periods (lasso.parameters attribute)": [[2, "lasso.Parameters.cube_time_periods"]], "properties_to_split (lasso.parameters attribute)": [[2, "lasso.Parameters.properties_to_split"]], "zones (lasso.parameters attribute)": [[2, "lasso.Parameters.zones"]], "calculated_values (lasso.project attribute)": [[3, "lasso.Project.CALCULATED_VALUES"]], "default_project_name (lasso.project attribute)": [[3, "id0"], [3, "lasso.Project.DEFAULT_PROJECT_NAME"]], "project (class in lasso)": [[3, "lasso.Project"]], "static_values (lasso.project attribute)": [[3, "id1"], [3, "lasso.Project.STATIC_VALUES"]], "__init__() (lasso.project method)": [[3, "lasso.Project.__init__"]], "add_highway_changes() (lasso.project method)": [[3, "lasso.Project.add_highway_changes"]], "add_transit_changes() (lasso.project method)": [[3, "lasso.Project.add_transit_changes"]], "base_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.base_cube_transit_network"]], "base_roadway_network (lasso.project attribute)": [[3, "lasso.Project.base_roadway_network"]], "build_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.build_cube_transit_network"]], "card_data (lasso.project attribute)": [[3, "lasso.Project.card_data"]], "create_project() (lasso.project static method)": [[3, "lasso.Project.create_project"]], "determine_roadway_network_changes_compatability() (lasso.project static method)": [[3, "lasso.Project.determine_roadway_network_changes_compatability"]], "emme_id_to_wrangler_id() (lasso.project static method)": [[3, "lasso.Project.emme_id_to_wrangler_id"]], "emme_name_to_wrangler_name() (lasso.project static method)": [[3, "lasso.Project.emme_name_to_wrangler_name"]], "evaluate_changes() (lasso.project method)": [[3, "lasso.Project.evaluate_changes"]], "get_object_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_object_from_network_build_command"]], "get_operation_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_operation_from_network_build_command"]], "parameters (lasso.project attribute)": [[3, "lasso.Project.parameters"]], "project_name (lasso.project attribute)": [[3, "lasso.Project.project_name"]], 
"read_logfile() (lasso.project static method)": [[3, "lasso.Project.read_logfile"]], "read_network_build_file() (lasso.project static method)": [[3, "lasso.Project.read_network_build_file"]], "roadway_link_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_link_changes"]], "roadway_node_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_node_changes"]], "transit_changes (lasso.project attribute)": [[3, "lasso.Project.transit_changes"]], "write_project_card() (lasso.project method)": [[3, "lasso.Project.write_project_card"]], "standardtransit (class in lasso)": [[4, "lasso.StandardTransit"]], "__init__() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.__init__"]], "calculate_cube_mode() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.calculate_cube_mode"]], "cube_format() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.cube_format"]], "evaluate_differences() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.evaluate_differences"]], "feed (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.feed"]], "fromtransitnetwork() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.fromTransitNetwork"]], "parameters (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.parameters"]], "read_gtfs() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.read_gtfs"]], "route_properties_gtfs_to_cube() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.route_properties_gtfs_to_cube"]], "shape_gtfs_to_cube() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_cube"]], "shape_gtfs_to_dict_list() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_dict_list"]], "shape_gtfs_to_emme() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_emme"]], "time_to_cube_time_period() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.time_to_cube_time_period"]], "write_as_cube_lin() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.write_as_cube_lin"]], "lasso.logger": [[5, "module-lasso.logger"]], "module": [[5, "module-lasso.logger"], [6, "module-lasso.util"], [7, "module-lasso"]], "setuplogging() (in module lasso.logger)": [[5, "lasso.logger.setupLogging"]], "point (class in lasso.util)": [[6, "lasso.util.Point"]], "polygon (class in lasso.util)": [[6, "lasso.util.Polygon"]], "almost_equals() (lasso.util.point method)": [[6, "lasso.util.Point.almost_equals"]], "almost_equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.almost_equals"]], "area (lasso.util.point property)": [[6, "lasso.util.Point.area"]], "area (lasso.util.polygon property)": [[6, "lasso.util.Polygon.area"]], "args (lasso.util.partial attribute)": [[6, "lasso.util.partial.args"]], "boundary (lasso.util.point property)": [[6, "lasso.util.Point.boundary"]], "boundary (lasso.util.polygon property)": [[6, "lasso.util.Polygon.boundary"]], "bounds (lasso.util.point property)": [[6, "lasso.util.Point.bounds"]], "bounds (lasso.util.polygon property)": [[6, "lasso.util.Polygon.bounds"]], "buffer() (lasso.util.point method)": [[6, "lasso.util.Point.buffer"]], "buffer() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.buffer"]], "centroid (lasso.util.point property)": [[6, "lasso.util.Point.centroid"]], "centroid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.centroid"]], "column_name_to_parts() (in module lasso.util)": [[6, "lasso.util.column_name_to_parts"]], "contains() (lasso.util.point method)": [[6, 
"lasso.util.Point.contains"]], "contains() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains"]], "contains_properly() (lasso.util.point method)": [[6, "lasso.util.Point.contains_properly"]], "contains_properly() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains_properly"]], "convex_hull (lasso.util.point property)": [[6, "lasso.util.Point.convex_hull"]], "convex_hull (lasso.util.polygon property)": [[6, "lasso.util.Polygon.convex_hull"]], "coords (lasso.util.point property)": [[6, "lasso.util.Point.coords"]], "coords (lasso.util.polygon property)": [[6, "lasso.util.Polygon.coords"]], "covered_by() (lasso.util.point method)": [[6, "lasso.util.Point.covered_by"]], "covered_by() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covered_by"]], "covers() (lasso.util.point method)": [[6, "lasso.util.Point.covers"]], "covers() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covers"]], "create_locationreference() (in module lasso.util)": [[6, "lasso.util.create_locationreference"]], "crosses() (lasso.util.point method)": [[6, "lasso.util.Point.crosses"]], "crosses() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.crosses"]], "difference() (lasso.util.point method)": [[6, "lasso.util.Point.difference"]], "difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.difference"]], "disjoint() (lasso.util.point method)": [[6, "lasso.util.Point.disjoint"]], "disjoint() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.disjoint"]], "distance() (lasso.util.point method)": [[6, "lasso.util.Point.distance"]], "distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.distance"]], "dwithin() (lasso.util.point method)": [[6, "lasso.util.Point.dwithin"]], "dwithin() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.dwithin"]], "envelope (lasso.util.point property)": [[6, "lasso.util.Point.envelope"]], "envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.envelope"]], "equals() (lasso.util.point method)": [[6, "lasso.util.Point.equals"]], "equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals"]], "equals_exact() (lasso.util.point method)": [[6, "lasso.util.Point.equals_exact"]], "equals_exact() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals_exact"]], "exterior (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.exterior"]], "exterior (lasso.util.polygon property)": [[6, "id0"]], "from_bounds() (lasso.util.polygon class method)": [[6, "lasso.util.Polygon.from_bounds"]], "func (lasso.util.partial attribute)": [[6, "lasso.util.partial.func"]], "geodesic_point_buffer() (in module lasso.util)": [[6, "lasso.util.geodesic_point_buffer"]], "geom_type (lasso.util.point property)": [[6, "lasso.util.Point.geom_type"]], "geom_type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.geom_type"]], "geometrytype() (lasso.util.point method)": [[6, "lasso.util.Point.geometryType"]], "geometrytype() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.geometryType"]], "get_shared_streets_intersection_hash() (in module lasso.util)": [[6, "lasso.util.get_shared_streets_intersection_hash"]], "has_z (lasso.util.point property)": [[6, "lasso.util.Point.has_z"]], "has_z (lasso.util.polygon property)": [[6, "lasso.util.Polygon.has_z"]], "hausdorff_distance() (lasso.util.point method)": [[6, "lasso.util.Point.hausdorff_distance"]], "hausdorff_distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.hausdorff_distance"]], "hhmmss_to_datetime() (in module lasso.util)": [[6, 
"lasso.util.hhmmss_to_datetime"]], "interiors (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.interiors"]], "interiors (lasso.util.polygon property)": [[6, "id1"]], "interpolate() (lasso.util.point method)": [[6, "lasso.util.Point.interpolate"]], "interpolate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.interpolate"]], "intersection() (lasso.util.point method)": [[6, "lasso.util.Point.intersection"]], "intersection() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersection"]], "intersects() (lasso.util.point method)": [[6, "lasso.util.Point.intersects"]], "intersects() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersects"]], "is_closed (lasso.util.point property)": [[6, "lasso.util.Point.is_closed"]], "is_closed (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_closed"]], "is_empty (lasso.util.point property)": [[6, "lasso.util.Point.is_empty"]], "is_empty (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_empty"]], "is_ring (lasso.util.point property)": [[6, "lasso.util.Point.is_ring"]], "is_ring (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_ring"]], "is_simple (lasso.util.point property)": [[6, "lasso.util.Point.is_simple"]], "is_simple (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_simple"]], "is_valid (lasso.util.point property)": [[6, "lasso.util.Point.is_valid"]], "is_valid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_valid"]], "keywords (lasso.util.partial attribute)": [[6, "lasso.util.partial.keywords"]], "lasso.util": [[6, "module-lasso.util"]], "length (lasso.util.point property)": [[6, "lasso.util.Point.length"]], "length (lasso.util.polygon property)": [[6, "lasso.util.Polygon.length"]], "line_interpolate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_interpolate_point"]], "line_interpolate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_interpolate_point"]], "line_locate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_locate_point"]], "line_locate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_locate_point"]], "minimum_clearance (lasso.util.point property)": [[6, "lasso.util.Point.minimum_clearance"]], "minimum_clearance (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_clearance"]], "minimum_rotated_rectangle (lasso.util.point property)": [[6, "lasso.util.Point.minimum_rotated_rectangle"]], "minimum_rotated_rectangle (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_rotated_rectangle"]], "normalize() (lasso.util.point method)": [[6, "lasso.util.Point.normalize"]], "normalize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.normalize"]], "oriented_envelope (lasso.util.point property)": [[6, "lasso.util.Point.oriented_envelope"]], "oriented_envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.oriented_envelope"]], "overlaps() (lasso.util.point method)": [[6, "lasso.util.Point.overlaps"]], "overlaps() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.overlaps"]], "partial (class in lasso.util)": [[6, "lasso.util.partial"]], "point_on_surface() (lasso.util.point method)": [[6, "lasso.util.Point.point_on_surface"]], "point_on_surface() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.point_on_surface"]], "project() (lasso.util.point method)": [[6, "lasso.util.Point.project"]], "project() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.project"]], "relate() (lasso.util.point method)": [[6, "lasso.util.Point.relate"]], "relate() 
(lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate"]], "relate_pattern() (lasso.util.point method)": [[6, "lasso.util.Point.relate_pattern"]], "relate_pattern() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate_pattern"]], "representative_point() (lasso.util.point method)": [[6, "lasso.util.Point.representative_point"]], "representative_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.representative_point"]], "reverse() (lasso.util.point method)": [[6, "lasso.util.Point.reverse"]], "reverse() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.reverse"]], "secs_to_datetime() (in module lasso.util)": [[6, "lasso.util.secs_to_datetime"]], "segmentize() (lasso.util.point method)": [[6, "lasso.util.Point.segmentize"]], "segmentize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.segmentize"]], "shorten_name() (in module lasso.util)": [[6, "lasso.util.shorten_name"]], "simplify() (lasso.util.point method)": [[6, "lasso.util.Point.simplify"]], "simplify() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.simplify"]], "svg() (lasso.util.point method)": [[6, "lasso.util.Point.svg"]], "svg() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.svg"]], "symmetric_difference() (lasso.util.point method)": [[6, "lasso.util.Point.symmetric_difference"]], "symmetric_difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.symmetric_difference"]], "touches() (lasso.util.point method)": [[6, "lasso.util.Point.touches"]], "touches() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.touches"]], "transform() (in module lasso.util)": [[6, "lasso.util.transform"]], "type (lasso.util.point property)": [[6, "lasso.util.Point.type"]], "type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.type"]], "unidecode() (in module lasso.util)": [[6, "lasso.util.unidecode"]], "union() (lasso.util.point method)": [[6, "lasso.util.Point.union"]], "union() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.union"]], "within() (lasso.util.point method)": [[6, "lasso.util.Point.within"]], "within() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.within"]], "wkb (lasso.util.point property)": [[6, "lasso.util.Point.wkb"]], "wkb (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb"]], "wkb_hex (lasso.util.point property)": [[6, "lasso.util.Point.wkb_hex"]], "wkb_hex (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb_hex"]], "wkt (lasso.util.point property)": [[6, "lasso.util.Point.wkt"]], "wkt (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkt"]], "x (lasso.util.point property)": [[6, "lasso.util.Point.x"]], "xy (lasso.util.point property)": [[6, "lasso.util.Point.xy"]], "xy (lasso.util.polygon property)": [[6, "lasso.util.Polygon.xy"]], "y (lasso.util.point property)": [[6, "lasso.util.Point.y"]], "z (lasso.util.point property)": [[6, "lasso.util.Point.z"]], "lasso": [[7, "module-lasso"]]}}) \ No newline at end of file diff --git a/setup/index.html b/setup/index.html new file mode 100644 index 0000000..b8c1f8b --- /dev/null +++ b/setup/index.html @@ -0,0 +1,131 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + +
Setup

Projects

Parameters

Settings

Additional Data Files

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+ + + + \ No newline at end of file diff --git a/starting/index.html b/starting/index.html new file mode 100644 index 0000000..2f76c23 --- /dev/null +++ b/starting/index.html @@ -0,0 +1,432 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

The following example uses a conda environment (recommended) and the package manager pip to install Lasso from source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPI repository because it is included in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the develop branch of each repository on GitHub:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e flag installs it in editable mode.

+

If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e flag installs it in editable mode.

  2. +
  3. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on GitHub.

  4. +
  5. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @<tag>; see the example after these notes.

  6. +
  7. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same cloning and installation instructions used for Lasso.

  8. +
+
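For example, installing from a specific tag or branch looks like the install commands above, with the placeholder replaced by the tag or branch you want:

pip install git+https://github.com/wsp-sag/Lasso@<tag_or_branchname>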

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc. +with Python syntax highlighting turned on.

  • +
  • GitHub Desktop to locally update your clones

  • +
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of OpenStreetMap and Shared Streets. In Network Wrangler these are read in from three JSON files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. A purely illustrative sketch of such a nested field appears after this list.

  • +
  • [transit network], which is based on a frequency-based implementation of the CSV-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes resulting from projects in YAML (yml) format.

  • +
+
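As a purely illustrative sketch (the key names below are assumptions for illustration, not the authoritative roadway network schema), a link field that varies by time of day might be nested roughly like this:

# Purely illustrative: key names here are assumptions, not the authoritative
# roadway schema. The point is that a field such as "lanes" can hold a default
# value plus overrides for ad-hoc time-of-day spans or user categories.
example_link = {
    "model_link_id": 123,
    "name": "I 35E",
    "lanes": {
        "default": 3,
        "timeofday": [
            {"time": ("6:00", "9:00"), "value": 2},  # peak-period override
        ],
    },
}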

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds data fields to the roadway network schema that MetCouncil uses in their travel model including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios. Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in Cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object is a subclass of RoadwayNetwork and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object for holding a standard transit feed as a Partridge object, with methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. It can parse Cube line file properties and shapes into Python dictionaries, compare line files, and represent changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a Parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the defaults listed in this class.

  • +
+
+

RoadwayNetwork

+

Reads, writes, queries, and manipulates roadway network data, which is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key='shape_id',
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+net.apply_roadway_feature_change(
+    net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
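A minimal sketch of reading and writing a standard transit network. TransitNetwork.read() is used the same way in the Scenario example below; the write(path=...) call is an assumption about the network wrangler API and should be checked against its documentation.

from network_wrangler import TransitNetwork

# read a standard (frequency-based, GTFS-like) transit feed from a directory
transit_net = TransitNetwork.read(STPAUL_DIR)

# write it back out as standard network files; the path keyword is an assumption
transit_net.write(path=WRITE_DIR)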
+
+

ProjectCard

+
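A minimal sketch of reading a single project card, using the same ProjectCard.read() call as the Scenario example below:

import os

from network_wrangler import ProjectCard

# read one project card from a YAML file; validate=False skips schema validation
card = ProjectCard.read(
    os.path.join(STPAUL_DIR, "project_cards", "4_simple_managed_lane.yml"),
    validate=False,
)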
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a Cube log file and a base network.

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork class which adds understanding about how to translate and write the network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the Project class; it has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration  file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# the network is written according to the parameters provided above
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. +
  3. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.

  4. +
+
+

Project Cards from Transit LIN Files

+
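A minimal sketch of this workflow, reusing the Project example above: compare a base and a build set of Cube transit line files, evaluate the differences, and write them out as a project card (import paths assumed from the lasso package root).

import os

from lasso import Project

# compare base and build Cube transit line files and record the differences
test_project = Project.create_project(
    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
)
test_project.evaluate_changes()

# write the changes out as a project card
test_project.write_project_card(
    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
)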
+
+

Project Cards from Cube LOG Files

+
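A sketch of this workflow under stated assumptions: the keyword names roadway_log_file and base_roadway_dir, and the log file name, are assumptions about Project.create_project and should be checked against the API reference; evaluate_changes() and write_project_card() are used the same way as in the Project example above.

import os

from lasso import Project

# keyword names and the log file name below are assumptions; confirm against
# the Project.create_project API reference before use
log_project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "example_roadway_edits.log"),
    base_roadway_dir=STPAUL_DIR,
)
log_project.evaluate_changes()
log_project.write_project_card(
    os.path.join(SCRATCH_DIR, "roadway_changes_from_log.yml")
)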
+
+

Model Network Files for a Scenario

+
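A minimal sketch of this workflow, stitching together the Scenario, Parameters, and ModelRoadwayNetwork examples above (my_base_scenario, project_cards_list, and my_parameters are placeholders built as shown earlier on this page):

from lasso import ModelRoadwayNetwork
from network_wrangler import Scenario

# build a scenario from a base network plus a set of project cards, as above
my_scenario = Scenario.create_scenario(
    base_scenario=my_base_scenario,
    project_cards_list=project_cards_list,
)
my_scenario.apply_all_projects()

# translate the resulting roadway network into the MetCouncil model schema
model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
    my_scenario.road_net, parameters=my_parameters
)
model_road_net.write_roadway_as_fixedwidth()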
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic Lasso functionality, please refer to the following Jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file