Absolute Accuracy

Absolute accuracy is an evaluation of how well the measurements (LiDAR points) of a dataset align with true or accepted values. The true values for a project area/datum are generally established by a source of known higher accuracy, such as a professional land survey. Because LiDAR is considered a digital elevation data source, it is commonly evaluated against truth points in the vertical direction only.

Absolute accuracy results are highly influenced by these factors:

  • Reference station coordinate accuracy

  • Processing workflow & method of evaluation

  • System calibration (lever arms & boresight)

Measuring Absolute Accuracy

Phoenix LiDAR recommends that all users and LiDAR data service providers understand industry standards and establish an agreement with data end users about how data should be collected, produced, and evaluated for quality. Phoenix’s method for reporting absolute accuracy during testing is based on independent checkpoints only; any surveyed point used as control is never included when computing accuracy statistics.

  1. LiDAR elevations are compared to surveyed checkpoints.

    1. Only hard-surface checkpoints are used.

    2. At least 20 checkpoints are considered.

    3. Outliers are removed only when a checkpoint is obstructed by a vehicle or other non-permanent object.

  2. LiDAR elevation values are then interpolated from a TIN (triangulated irregular network) model.

    1. The LiDAR FOV is limited to a 90-degree maximum (or less, depending on the scanner’s maximum FOV).

    2. The TIN model is built from all points except distant isolated noise (e.g., birds and pits) caught by an isolated-points filter.

  3. The difference between the interpolated value and the surveyed checkpoint’s elevation is computed.

  4. Finally, the dataset is summarized by using those differences to compute RMSEz (see the sketch after this list).
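In formula form, RMSEz = √((1/n) Σᵢ (zᵢ,TIN − zᵢ,check)²), where zᵢ,TIN is the LiDAR elevation interpolated from the TIN at checkpoint i and zᵢ,check is that checkpoint’s surveyed elevation. The sketch below shows one way to implement the interpolation and summary steps in Python with NumPy and SciPy. It is a simplified illustration, not Phoenix LiDAR software: the array names, the isolated-points filter, and its thresholds (`radius`, `min_neighbors`) are assumptions made for the example.

```python
# A minimal sketch, assuming `points` is an (N, 3) array of LiDAR x/y/z
# coordinates and `checkpoints` is an (M, 3) array of surveyed checkpoint
# coordinates, both in the same projected coordinate system. FOV limiting
# and exclusion of control points are assumed to have happened upstream.
import numpy as np
from scipy.spatial import cKDTree
from scipy.interpolate import LinearNDInterpolator


def drop_isolated_points(points, radius=2.0, min_neighbors=3):
    """Illustrative isolated-points filter: discard points with too few
    3D neighbors within `radius` (a stand-in for birds/pits removal)."""
    tree = cKDTree(points)
    counts = tree.query_ball_point(points, r=radius, return_length=True)
    # `counts` includes the query point itself, so require strictly more.
    return points[counts > min_neighbors]


def rmse_z(points, checkpoints):
    """Build a TIN (Delaunay-based linear interpolation) from the filtered
    cloud, sample it at each checkpoint's x/y, and return vertical RMSE."""
    clean = drop_isolated_points(points)
    tin = LinearNDInterpolator(clean[:, :2], clean[:, 2])
    z_tin = tin(checkpoints[:, :2])     # interpolated LiDAR elevations
    dz = z_tin - checkpoints[:, 2]      # vertical differences
    dz = dz[~np.isnan(dz)]              # NaN = checkpoint outside the TIN
    return np.sqrt(np.mean(dz ** 2))    # RMSEz


if __name__ == "__main__":
    # Synthetic demo: a near-flat cloud at z = 10 m with 20 checkpoints.
    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 100.0, size=(5000, 3))
    pts[:, 2] = 10.0 + 0.01 * rng.standard_normal(5000)
    cps = np.column_stack([rng.uniform(10, 90, size=(20, 2)),
                           np.full(20, 10.0)])
    print(f"RMSEz = {rmse_z(pts, cps):.3f} m")
```

Note that checkpoints falling outside the TIN’s convex hull interpolate to NaN and are dropped rather than extrapolated, matching the intent of evaluating only checkpoints covered by the point cloud.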

Data providers should always reach an agreement with end users about data quality standards and reporting specifications.
