
Merge pull request #818 from conveyal/dev
2022-07 Release
ansoncfit authored Aug 1, 2022
2 parents 43068e1 + 12be87f commit a528745
Showing 21 changed files with 295 additions and 66 deletions.
16 changes: 7 additions & 9 deletions README.md
@@ -7,7 +7,9 @@ We refer to the routing method as "realistic" because it works by planning door-

We say "Real-world and Reimagined" networks because R5's networks are built from widely available open OSM and GTFS data describing baseline transportation systems, but R5 includes a system for applying lightweight patches to those networks for immediate, interactive scenario comparison.

**Please note** that the Conveyal team does not provide technical support for third-party deployments of its analysis platform. We provide paid subscriptions to a cloud-based deployment of this system, which performs these complex calculations hundreds of times faster using a compute cluster. This project is open source primarily to ensure transparency and reproducibility in public planning and decision making processes, and in hopes that it may help researchers, students, and potential collaborators to understand and build upon our methodology.
**Please note** that the Conveyal team does not provide technical support for third-party deployments. R5 is a component of a specialized commercial system, and we align development efforts with our roadmap and the needs of subscribers to our hosted service. This service is designed to facilitate secure online collaboration, user-friendly data management and scenario editing through a web interface, and complex calculations performed hundreds of times faster using a compute cluster. These design goals may not align well with other use cases. This project is open source primarily to ensure transparency and reproducibility in public planning and decision making processes, and in hopes that it may help researchers, students, and potential collaborators to understand and build upon our methodology.

While the Conveyal team provides ongoing support and compatibility to subscribers, third-party projects using R5 as a library may not work with future releases. R5 does not currently expose a stable programming interface ("API" or "SDK"). As we release new features, previous functions and data types may change. The practical effect is that third-party wrappers or language bindings (e.g., for R or Python) may need to continue using an older release of R5 for feature compatibility (though not necessarily result compatibility, as the methods used in R5 are now relatively mature).

## Methodology

@@ -19,7 +21,7 @@ For details on the core methods implemented in Conveyal Analysis and R5, see:

### Citations

The Conveyal team is always eager to see cutting-edge uses of our software, so feel free to send us a copy of any thesis, report, or paper produced using this software. We also ask that any academic publications using this software cite the papers above, where relevant and appropriate.
The Conveyal team is always eager to see cutting-edge uses of our software, so feel free to send us a copy of any thesis, report, or paper produced using this software. We also ask that any academic or research publications using this software cite the papers above, where relevant and appropriate.

## Configuration

@@ -52,7 +54,7 @@ By default, IntelliJ will follow common Gradle practice and build R5 using the "

## Structured Commit Messages

We use structured commit messages to help generate changelogs and determine version numbers.
We use structured commit messages to help generate changelogs.

The first line of these messages is in the following format: `<type>(<scope>): <summary>`

@@ -68,10 +70,6 @@ The `(<scope>)` is optional and is often a class name. The `<summary>` should be
- devops: Changes to code that only affect deployment, logging, etc. No changes to user code.
- chore: Any other changes causing no changes to user code.

The body of the commit message (if any) should begin after one blank line. If the commit meets the definition of a major version change according to semantic versioning (e.g. a change in API visible to an external module), the commit message body should begin with `BREAKING CHANGE: <description>`.

Presence of a `fix` commit in a release should increment the number in the third (PATCH) position.
Presence of a `feat` commit in a release should increment the number in the second (MINOR) position.
Presence of a `BREAKING CHANGE` commit in a release should increment the number in the first (MAJOR) position.
The body of the commit message (if any) should begin after one blank line.
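For example, a commit message following this convention (the scope and wording here are hypothetical, for illustration only) might read:

```
feat(TransitLayer): add support for frequency-based trips

Describe the motivation and details of the change here,
beginning after one blank line.
```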

This is based on https://www.conventionalcommits.org.
From 2018 to 2020, we used major/minor/patch release numbering as suggested by https://www.conventionalcommits.org. Starting in 2021, we switched to major/minor release numbering, incrementing the minor version with regular feature releases and the major version only when there are substantial changes to the cluster computing components of our system. Because there is no public API at this time, the conventional definition of breaking changes under semantic versioning does not apply.
@@ -168,11 +168,39 @@ public final void run () {
List<ApiTask> apiTaskList = tasks.stream()
.map(Task::toApiTask)
.collect(Collectors.toUnmodifiableList());
tasks.removeIf(t -> t.durationComplete().getSeconds() > 60);
// Purge old tasks.
tasks.removeIf(t -> t.durationComplete().toDays() > 3);
return apiTaskList;
}
}
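The `removeIf` purge above keeps finished tasks for three days after completion. A minimal standalone sketch of that `Duration`-based filtering, using a hypothetical `FakeTask` stand-in for the real `Task` class (not R5 code):

```java
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;

public class PurgeSketch {
    // Hypothetical stand-in for com.conveyal.r5.analyst.progress.Task.
    static class FakeTask {
        final String id;
        final Duration durationComplete;
        FakeTask (String id, Duration durationComplete) {
            this.id = id;
            this.durationComplete = durationComplete;
        }
    }

    public static void main (String[] args) {
        List<FakeTask> tasks = new ArrayList<>();
        tasks.add(new FakeTask("fresh", Duration.ofHours(2)));
        tasks.add(new FakeTask("stale", Duration.ofDays(4)));
        // Same predicate shape as in TaskScheduler: purge tasks completed more than 3 days ago.
        tasks.removeIf(t -> t.durationComplete.toDays() > 3);
        for (FakeTask t : tasks) System.out.println(t.id);
    }
}
```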

/**
* Return a single task. Does not purge old tasks.
*/
public Task getTaskForUser (String userEmail, String taskId) {
synchronized (tasksForUser) {
Set<Task> tasks = tasksForUser.get(userEmail);
if (tasks == null) return null;
for (Task t : tasks) {
if (t.id.toString().equals(taskId)) {
return t;
}
}
return null;
}
}

/**
* Remove a task. Returns false if task does not exist. Returns true if task exists and is properly removed.
*/
public boolean removeTaskForUser (String userEmail, String taskId) {
synchronized (tasksForUser) {
Set<Task> tasks = tasksForUser.get(userEmail);
if (tasks == null) return false;
return tasks.removeIf(t -> t.id.toString().equals(taskId));
}
}

// Q: Should the caller ever create its own Tasks, or are tasks created here inside the TaskScheduler from
// other raw information? Having the caller create a Task seems like a good way to configure execution details
// like heavy/light/periodic, and to submit user information without passing it in. That task request could be separate
@@ -16,6 +16,7 @@
import com.conveyal.gtfs.error.GTFSError;
import com.conveyal.gtfs.error.GeneralError;
import com.conveyal.gtfs.model.Stop;
import com.conveyal.gtfs.validator.PostLoadValidator;
import com.conveyal.osmlib.Node;
import com.conveyal.osmlib.OSM;
import com.conveyal.r5.analyst.progress.ProgressInputStream;
@@ -199,6 +200,9 @@ private Bundle create (Request req, Response res) {
feed.progressListener = progressListener;
feed.loadFromFile(zipFile, new ObjectId().toString());

// Perform any more complex validation that requires cross-table checks.
new PostLoadValidator(feed).validate();

// Find and validate the extents of the GTFS, defined by all stops in the feed.
for (Stop s : feed.stops.values()) {
bundleBounds.expandToInclude(s.stop_lon, s.stop_lat);
@@ -151,7 +151,8 @@ static class StopApiResponse extends BaseApiResponse {
}
}
/**
* Return StopApiResponse values for GTFS stops (location_type = 0) in a single feed
* Return StopApiResponse values for GTFS stops (location_type = 0) in a single feed.
* All other location_types (station, entrance, generic node, boarding area) are skipped.
*/
private List<StopApiResponse> getAllStopsForOneFeed(Request req, Response res) {
GTFSFeed feed = getFeedFromRequest(req);
@@ -160,8 +161,8 @@ private List<StopApiResponse> getAllStopsForOneFeed(Request req, Response res) {
}
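The `location_type` filter described in these comments can be sketched as a simple stream filter. This is an illustrative, self-contained sketch with a hypothetical minimal `Stop` model, not the actual R5 implementation:

```java
import java.util.List;
import java.util.stream.Collectors;

public class StopFilterSketch {
    // Hypothetical minimal stand-in for the GTFS Stop model.
    static class Stop {
        final String id;
        final int location_type; // 0 = stop/platform, 1 = station, 2 = entrance, 3 = node, 4 = boarding area
        Stop (String id, int locationType) {
            this.id = id;
            this.location_type = locationType;
        }
    }

    // Keep only plain stops (location_type 0); skip stations, entrances, generic nodes, boarding areas.
    static List<Stop> stopsOnly (List<Stop> all) {
        return all.stream().filter(s -> s.location_type == 0).collect(Collectors.toList());
    }

    public static void main (String[] args) {
        List<Stop> all = List.of(new Stop("platform", 0), new Stop("station", 1), new Stop("entrance", 2));
        stopsOnly(all).forEach(s -> System.out.println(s.id));
    }
}
```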

/**
* Groups the feedId and stops (location_type = 0; not parent stations, entrances/exits, generic nodes, etc.) for a
* given GTFS feed
* Compound return type for the feedId and stops with location_type = 0 for a given GTFS feed.
* All other location_types (station, entrance, generic node, boarding area) are skipped.
*/
static class FeedGroupStopsApiResponse {
public final String feedId;
@@ -24,18 +24,13 @@
import com.conveyal.r5.analyst.progress.NoopProgressListener;
import com.conveyal.r5.analyst.progress.Task;
import com.conveyal.r5.analyst.progress.WorkProduct;
import com.conveyal.r5.analyst.progress.WorkProductType;
import com.conveyal.r5.util.ExceptionUtils;
import com.conveyal.r5.util.InputStreamProvider;
import com.conveyal.r5.util.ProgressListener;
import com.fasterxml.jackson.databind.node.ObjectNode;
import com.google.common.io.Files;
import com.mongodb.QueryBuilder;
import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.FileItemFactory;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.commons.io.FilenameUtils;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
@@ -210,7 +205,7 @@ private void updateAndStoreDatasets (SpatialDataSource source,
if (dataset.format == FileStorageFormat.FREEFORM) {
dataset.name = String.join(" ", pointSet.name, "(freeform)");
}
dataset.setWebMercatorExtents(pointSet);
dataset.setWebMercatorExtents(pointSet.getWebMercatorExtents());
// TODO make origin and destination pointsets reference each other and indicate they are suitable
// for one-to-one analyses

@@ -327,7 +322,7 @@ private OpportunityDatasetUploadStatus createOpportunityDataset(Request req, Res
// Parse required fields. Will throw a ServerException on failure.
final String sourceName = HttpUtils.getFormField(formFields, "Name", true);
final String regionId = HttpUtils.getFormField(formFields, "regionId", true);
final int zoom = parseZoom(HttpUtils.getFormField(formFields, "zoom", false));
final int zoom = parseZoom(HttpUtils.getFormField(formFields, "zoom", true));

// Create a region-wide status object tracking the processing of opportunity data.
// Create the status object before doing anything including input and parameter validation, so that any problems
@@ -4,7 +4,6 @@
import com.conveyal.analysis.SelectingGridReducer;
import com.conveyal.analysis.UserPermissions;
import com.conveyal.analysis.components.broker.Broker;
import com.conveyal.analysis.components.broker.Job;
import com.conveyal.analysis.components.broker.JobStatus;
import com.conveyal.analysis.models.AnalysisRequest;
import com.conveyal.analysis.models.OpportunityDataset;
@@ -33,7 +32,6 @@

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
@@ -402,7 +400,7 @@ private RegionalAnalysis createRegionalAnalysis (Request req, Response res) thro
);
} else {
checkArgument(
dataset.getWebMercatorExtents().zoom == opportunityDatasets.get(0).getWebMercatorExtents().zoom,
dataset.zoom == opportunityDatasets.get(0).zoom,
"If multiple grids are specified as destinations, they must have identical resolutions (web mercator zoom levels)."
);
}
@@ -1,15 +1,15 @@
package com.conveyal.analysis.controllers;

import com.conveyal.analysis.AnalysisServerException;
import com.conveyal.analysis.UserPermissions;
import com.conveyal.analysis.components.TaskScheduler;
import com.conveyal.r5.analyst.progress.ApiTask;
import com.google.common.collect.Iterables;
import com.google.common.collect.Lists;
import com.conveyal.r5.analyst.progress.Task;
import com.google.common.collect.ImmutableMap;
import spark.Request;
import spark.Response;
import spark.Service;

import java.util.ArrayList;
import java.util.List;

import static com.conveyal.analysis.util.JsonUtil.toJson;
@@ -40,6 +40,7 @@ public UserActivityController (TaskScheduler taskScheduler) {
@Override
public void registerEndpoints (Service sparkService) {
sparkService.get("/api/activity", this::getActivity, toJson);
sparkService.delete("/api/activity/:id", this::removeActivity, toJson);
}

private ResponseModel getActivity (Request req, Response res) {
@@ -53,6 +54,21 @@ private ResponseModel getActivity (Request req, Response res) {
return responseModel;
}

private Object removeActivity (Request req, Response res) {
UserPermissions userPermissions = UserPermissions.from(req);
String id = req.params("id");
Task task = taskScheduler.getTaskForUser(userPermissions.email, id);
// Guard against a missing task before dereferencing it.
if (task == null) {
throw AnalysisServerException.badRequest("Task not found.");
}
// Disallow removing active tasks via the API.
if (task.state.equals(Task.State.ACTIVE)) {
throw AnalysisServerException.badRequest("Cannot clear an active task.");
}
if (taskScheduler.removeTaskForUser(userPermissions.email, id)) {
return ImmutableMap.of("message", "Successfully cleared task.");
} else {
throw AnalysisServerException.badRequest("Failed to clear task.");
}
}

/** API model used only to structure activity JSON messages sent back to UI. */
public static class ResponseModel {
/** For example: "Server going down at 17:00 GMT for maintenance" or "Working to resolve known issue [link]." */
@@ -22,8 +22,6 @@
import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.operation.union.UnaryUnionOp;
import org.opengis.feature.simple.SimpleFeature;
import org.opengis.referencing.FactoryException;
import org.opengis.referencing.operation.TransformException;
import spark.Request;

import java.io.File;
@@ -37,7 +35,6 @@
import java.util.stream.Collectors;
import java.util.zip.GZIPOutputStream;

import static com.conveyal.file.FileStorageFormat.GEOJSON;
import static com.conveyal.file.FileStorageFormat.SHP;
import static com.conveyal.r5.analyst.WebMercatorGridPointSet.parseZoom;
import static com.conveyal.r5.analyst.progress.WorkProductType.AGGREGATION_AREA;
@@ -175,7 +172,7 @@ private static void prefetchDataSource (String baseName, String extension, FileS
public void action (ProgressListener progressListener) throws Exception {

ArrayList<AggregationArea> aggregationAreas = new ArrayList<>();
String groupDescription = "Aggregation areas from polygons";
String groupDescription = "z" + this.zoom + ": aggregation areas";
DataGroup dataGroup = new DataGroup(userPermissions, spatialDataSource._id.toString(), groupDescription);

progressListener.beginTask("Reading data source", finalFeatures.size() + 1);
Expand Down Expand Up @@ -209,7 +206,7 @@ public void action (ProgressListener progressListener) throws Exception {
maskGrid.grid[pixel.x][pixel.y] = pixel.weight * 100_000;
});

AggregationArea aggregationArea = new AggregationArea(userPermissions, name, spatialDataSource);
AggregationArea aggregationArea = new AggregationArea(userPermissions, name, spatialDataSource, zoom);

try {
File gridFile = FileUtils.createScratchFile("grid");
@@ -22,13 +22,16 @@ public class AggregationArea extends BaseModel {
public String dataSourceId;
public String dataGroupId;

public int zoom;

/** Zero-argument constructor required for Mongo automatic POJO deserialization. */
public AggregationArea () { }

public AggregationArea(UserPermissions user, String name, SpatialDataSource dataSource) {
public AggregationArea(UserPermissions user, String name, SpatialDataSource dataSource, int zoom) {
super(user, name);
this.regionId = dataSource.regionId;
this.dataSourceId = dataSource._id.toString();
this.zoom = zoom;
}

@JsonIgnore
@@ -13,6 +13,11 @@ public enum Level {
ERROR, WARN, INFO
}

/** Zero argument constructor required for MongoDB driver automatic POJO deserialization. */
public DataSourceValidationIssue () {

}

public DataSourceValidationIssue (Level level, String description) {
this.level = level;
this.description = description;
20 changes: 9 additions & 11 deletions src/main/java/com/conveyal/analysis/models/OpportunityDataset.java
@@ -2,12 +2,10 @@

import com.conveyal.file.FileStorageFormat;
import com.conveyal.file.FileStorageKey;
import com.conveyal.r5.analyst.PointSet;
import com.conveyal.r5.analyst.WebMercatorExtents;
import com.fasterxml.jackson.annotation.JsonIgnore;

import static com.conveyal.file.FileCategory.GRIDS;
import static com.conveyal.r5.analyst.WebMercatorGridPointSet.DEFAULT_ZOOM;

/**
* A model object for storing metadata about opportunity datasets in Mongo, for sharing it with the frontend.
@@ -34,13 +32,18 @@ public class OpportunityDataset extends Model {
public String bucketName;

/**
* Bounds in web Mercator pixels. Note that no zoom level is specified here, it's fixed to a constant 9.
* Bounds in web Mercator pixels.
*/
public int north;
public int west;
public int width;
public int height;

/**
* The zoom level this opportunity dataset was rasterized at.
*/
public int zoom;

/**
* Total number of opportunities in the dataset, i.e. the sum of all opportunity counts at all points / grid cells.
* It appears the UI doesn't use this now, but it could. We might want to remove it.
@@ -103,21 +106,16 @@ public FileStorageKey getStorageKey (FileStorageFormat fileFormat) {
}

@JsonIgnore
public WebMercatorExtents getWebMercatorExtents () {
return new WebMercatorExtents(west, north, width, height, DEFAULT_ZOOM);
}

@JsonIgnore
public void setWebMercatorExtents (PointSet pointset) {
public void setWebMercatorExtents (WebMercatorExtents extents) {
// These bounds are currently in web Mercator pixels, which are relevant to Grids but are not natural units
// for FreeformPointSets. There are only unique minimal web Mercator bounds for FreeformPointSets if
// the zoom level is fixed in OpportunityDataset (FIXME we may change this soon).
// the zoom level is fixed in OpportunityDataset.
// Perhaps these metadata bounds should be WGS84 instead, it depends how the UI uses them.
WebMercatorExtents extents = pointset.getWebMercatorExtents();
this.west = extents.west;
this.north = extents.north;
this.width = extents.width;
this.height = extents.height;
this.zoom = extents.zoom;
}

/** Analysis region this dataset was uploaded in. */
6 changes: 6 additions & 0 deletions src/main/java/com/conveyal/gtfs/GTFSCache.java
@@ -220,6 +220,12 @@ private void buildStopsIndex (String bundleScopedFeedId, STRtree tree) {
final GTFSFeed feed = this.get(bundleScopedFeedId);
LOG.info("{}: indexing {} stops", feed.feedId, feed.stops.size());
for (Stop stop : feed.stops.values()) {
// To match existing GTFS API, include only stop objects that have location_type 0.
// All other location_types (station, entrance, generic node, boarding area) are skipped.
// Missing (NaN) coordinates will confuse the spatial index and cascade hundreds of errors.
if (stop.location_type != 0 || !Double.isFinite(stop.stop_lat) || !Double.isFinite(stop.stop_lon)) {
continue;
}
Envelope stopEnvelope = new Envelope(stop.stop_lon, stop.stop_lon, stop.stop_lat, stop.stop_lat);
Point point = GeometryUtils.geometryFactory.createPoint(new Coordinate(stop.stop_lon, stop.stop_lat));

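The guard added above can be summarized as a single predicate: index a stop only when it has `location_type` 0 and finite coordinates. A standalone sketch of that predicate (hypothetical class, not part of R5):

```java
public class StopIndexFilterSketch {
    // Mirrors the guard above: index only location_type 0 stops with finite coordinates,
    // since NaN coordinates would confuse the spatial index.
    static boolean indexable (int locationType, double lat, double lon) {
        return locationType == 0 && Double.isFinite(lat) && Double.isFinite(lon);
    }

    public static void main (String[] args) {
        System.out.println(indexable(0, 45.5, -122.6));        // a normal platform: indexed
        System.out.println(indexable(0, Double.NaN, -122.6));  // missing latitude: skipped
        System.out.println(indexable(1, 45.5, -122.6));        // a station: skipped
    }
}
```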
Expand Down
