Absolute Accuracy
Last updated
Absolute accuracy is an evaluation of how well the measurements (LiDAR points) of a dataset align with true or accepted values. The true values for a project area/datum are generally established by a source of known higher accuracy, such as a professional land survey. LiDAR is considered a digital elevation data source, and is therefore commonly evaluated against truth points in the vertical direction only.
Absolute accuracy results are highly influenced by these factors:
Reference station coordinate accuracy
Processing workflow & method of evaluation
System calibration (lever arms & boresight)
Phoenix LiDAR recommends that all users and LiDAR data service providers understand the relevant industry standards and establish an agreement with data end users about how data will be collected, produced, and evaluated for quality. Phoenix’s method for conveying absolute accuracy during testing is based on independent checkpoints only: any surveyed point used as control is never included when computing accuracy statistics.
Accuracy is evaluated as a comparison to surveyed checkpoints.
Only hard surface checkpoints are utilized.
At least 20 checkpoints are considered.
Outliers are removed only when a checkpoint is obstructed by a vehicle or other non-permanent object.
LiDAR elevation values are then interpolated from a TIN model.
LiDAR FOV is limited to a maximum of 90 degrees (or less, depending on the scanner’s maximum FOV).
The TIN model is built from all points except distant isolated noise (e.g., birds and pits) caught by an isolated-points filter.
The difference between the interpolated value and each surveyed checkpoint’s elevation is computed.
Finally, the dataset is summarized by using those differences to compute RMSEz.
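The workflow above can be sketched in code. This is a minimal illustration, not Phoenix’s actual processing pipeline: it uses synthetic points on a known planar surface (so the expected error is near zero), SciPy’s Delaunay triangulation as the TIN, and linear interpolation to sample the LiDAR surface at each checkpoint before summarizing with RMSEz. All point data and parameter values here are invented for demonstration.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(42)

# Synthetic "LiDAR ground points": random (x, y) locations plus the four
# corners, so the TIN's convex hull covers the whole test area.
xy = np.vstack([rng.uniform(0.0, 100.0, size=(500, 2)),
                [[0, 0], [0, 100], [100, 0], [100, 100]]])
z = 0.05 * xy[:, 0] + 10.0          # gently sloping planar surface

# TIN model: Delaunay triangulation with linear interpolation per facet.
# (In practice the cloud would first pass an isolated-points noise filter.)
tin = Delaunay(xy)
interp = LinearNDInterpolator(tin, z)

# Surveyed hard-surface checkpoints (at least 20, per the workflow above),
# with "true" elevations taken from the same plane for this demo.
cp_xy = rng.uniform(10.0, 90.0, size=(20, 2))
cp_z = 0.05 * cp_xy[:, 0] + 10.0

# Vertical difference at each checkpoint: interpolated minus surveyed.
dz = interp(cp_xy) - cp_z

# Summarize the dataset as RMSEz.
rmse_z = float(np.sqrt(np.mean(dz ** 2)))
```

Because the synthetic surface is exactly planar and the interpolation is linear, `rmse_z` comes out near zero here; with real data it reflects the combined sensor, trajectory, and calibration errors listed earlier.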
Data providers should always reach an agreement with end users about data quality standards and reporting specifications.