Having authoritative linear referencing system (LRS) data sources is high on the agenda of every transportation agency. But let’s face it: most data administrators admit that the quality of their LRS data is less than desirable. Few have the means to measure information quality, let alone improve it or prevent quality issues from arising and propagating.
Yes, we may blame this on the lack of comprehensive QC tools designed for LRS data. However, TIQM (Total Information Quality Management) principles suggest that the key issue is the absence of an information quality definition in the LRS domain.
The dichotomy between LRS networks and LRS feature layers leads us to examine their QC issues separately. On the LRS network side, the following rules should be observed:
- RDBMS integrity rules
  - Entity integrity – the LRS key is unique and not null.
  - Referential integrity – the route type value in an LRS table, for example, may need to be foreign-keyed to a route-types table.
- Topology rules
  - Overshoots and undershoots should be identified and corrected.
  - A pair of lines that cross without a common node should be brought to users’ attention to determine whether the pair should be broken at the intersection point.
- LRS rules
  - Route measures should be monotonic. In other words, any measure along a route should reference one and only one location on the route.
  - A significant difference between the calibrated route length and the geometric length may indicate an error and should therefore be flagged.
- Common-sense rules
  - Routes for vehicular travel should not have, for example, breaks of less than a foot.
  - Routes for vehicular travel should not have sharp turns or spikes.
  - A route should intersect at least one other route.
- Technology-specific rules
  - Esri, Oracle and PostGIS all have their own rules for storing LRS objects. Oracle, for example, requires that the route geometry direction be consistent with the route measure direction.
- User-defined rules
  - Users can define their own criteria to enforce, for example, road naming conventions, value ranges and patterns.
- Performance rules
  - Lines with overly dense or even duplicate vertices should be generalized while retaining the desired representational resolution.
  - Pseudo nodes should be removed to reduce the line object count.
  - Zero-length lines should be removed. (Do not be surprised to find them in your spatial database!)
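Some of the network-side rules above lend themselves to straightforward automated checks. The following is a minimal sketch, assuming routes are stored as lists of (x, y, m) vertices; the field layout, the `check_route` function name and the 5% length tolerance are illustrative assumptions, not a standard.

```python
from math import hypot

def check_route(vertices, calibrated_length, length_tolerance=0.05):
    """Return a list of QC violations for one route.

    vertices: list of (x, y, m) tuples in route order (hypothetical model).
    """
    violations = []

    # LRS rule: measures must be strictly monotonic along the route,
    # so each measure references one and only one location.
    measures = [m for _, _, m in vertices]
    if any(m2 <= m1 for m1, m2 in zip(measures, measures[1:])):
        violations.append("non-monotonic measures")

    # Geometric length summed over the vertex chain.
    geom_length = sum(
        hypot(x2 - x1, y2 - y1)
        for (x1, y1, _), (x2, y2, _) in zip(vertices, vertices[1:])
    )

    # Performance rule: zero-length lines should be removed.
    if geom_length == 0:
        violations.append("zero-length line")
    # LRS rule: flag a significant calibrated-vs-geometric length gap.
    elif abs(geom_length - calibrated_length) / calibrated_length > length_tolerance:
        violations.append("calibrated/geometric length mismatch")

    return violations

# Usage: a route whose measures run backwards between two vertices.
bad_route = [(0.0, 0.0, 0.0), (3.0, 4.0, 5.0), (6.0, 8.0, 4.0)]
print(check_route(bad_route, calibrated_length=10.0))
# -> ['non-monotonic measures']
```

In practice these checks would run inside the spatial database or a GIS toolkit rather than on raw tuples, but the logic is the same.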
The QC of LRS feature layers does not deal with spatial geometry, but that does not make it any less challenging. Here are the LRS feature layer QC rules:
- Basic RDBMS rules
  - Entity integrity – the feature key is unique and not null.
  - Referential integrity – the feature type value in a feature layer, for example, may need to be foreign-keyed to a table of standard feature types.
- Domain rule
  - Feature layer objects (segments) should exist only within the confines of the LRS network. For example, if a route has a mile range of 0 to 100, any crash record referenced beyond that range violates the domain rule.
- Overlap rule
  - This rule catches duplicate or conflicting attribute values in feature layers.
- Gap rule
  - This rule catches incomplete feature coverage of the network.
- User-defined rules (UDR)
  - These rules maintain the consistency of non-LRS attributes in feature layers. For instance, if a UDR specifies that the acceptable integer values for crash severity range from 1 to 5, the generated QC report will pick out rows with null values or out-of-range values such as 6.
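The domain, overlap, gap and UDR rules above can all be evaluated with a single pass over a route’s events. Here is a minimal sketch, assuming each event is a (from_measure, to_measure, value) tuple on one route; the `check_events` function name, the route length and the 1–5 severity domain are illustrative assumptions.

```python
def check_events(events, route_length, value_domain=range(1, 6)):
    """Return QC violations for the events on one route (hypothetical model)."""
    violations = []
    events = sorted(events, key=lambda e: e[0])

    covered_to = 0.0
    for from_m, to_m, value in events:
        # Domain rule: events must stay within the route's measure range.
        if from_m < 0 or to_m > route_length:
            violations.append(f"domain violation at {from_m}-{to_m}")
        # Overlap rule: an event starts before the previous one ends.
        if from_m < covered_to:
            violations.append(f"overlap at {from_m}")
        # Gap rule: an uncovered stretch between consecutive events.
        elif from_m > covered_to:
            violations.append(f"gap from {covered_to} to {from_m}")
        covered_to = max(covered_to, to_m)
        # User-defined rule: the attribute value must fall in its domain
        # (None, i.e. a null, fails this test as well).
        if value not in value_domain:
            violations.append(f"UDR violation: value {value}")

    # Gap rule also applies to the tail end of the route.
    if covered_to < route_length:
        violations.append(f"gap from {covered_to} to {route_length}")
    return violations

# Usage: severity events on a 10-mile route with a gap and a bad value.
events = [(0.0, 4.0, 3), (5.0, 10.0, 6)]
print(check_events(events, route_length=10.0))
# -> ['gap from 4.0 to 5.0', 'UDR violation: value 6']
```

Whether overlaps and gaps are actual errors depends on the layer: pavement condition should cover a route seamlessly, while crash events are points and legitimately sparse, so a production tool would apply these rules per layer type.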
LinearBench® Analyze incorporates most of the rules outlined above. I hope this article helps frame the discussion about the scope of LRS data QC and leads to more comprehensive LRS QC tools for practitioners.