Absolute Accuracy
Absolute accuracy is an evaluation of how well the measurements (LiDAR points) in a dataset align with true or accepted values. The true values for a project area/datum are generally established by a source of known higher accuracy, such as a professional land survey. LiDAR is considered a digital elevation data source and is therefore commonly evaluated against truth points in the vertical direction only.
Absolute accuracy results are highly influenced by these factors:
Reference station coordinate accuracy
Processing workflow & method of evaluation
System calibration (lever arms & boresight)
The method of evaluation is a comparison to surveyed checkpoints:
- Only hard-surface checkpoints are utilized.
- At least 20 checkpoints are considered.
- Outliers are removed only in cases where a checkpoint is obstructed by a vehicle or other non-permanent object.
LiDAR elevation values are then interpolated from a TIN model:
- The LiDAR FOV is set to a maximum of 90 degrees (or less, depending on the scanner's maximum FOV).
- The TIN model is built from all points except distant isolated noise (birds and pits) caught by an isolated-points filter.
The difference between each interpolated value and the surveyed checkpoint's elevation is computed. Finally, the dataset is summarized by using those differences to determine RMSEz, as sketched below.
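To make these steps concrete, here is a minimal sketch in Python (assuming NumPy and SciPy are available; the function name rmse_z and the array layouts are illustrative, not part of any specific toolchain). It builds a TIN over the filtered LiDAR points, interpolates an elevation at each checkpoint location, and summarizes the vertical differences as RMSEz = sqrt((1/n) * Σ Δz_i²):

```python
# Minimal sketch of TIN interpolation and RMSEz, assuming the filtered
# LiDAR ground points and the surveyed checkpoints are already loaded
# as NumPy arrays. Names and shapes here are illustrative assumptions.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

def rmse_z(lidar_xyz: np.ndarray, checkpoints_xyz: np.ndarray) -> float:
    """Return vertical RMSE of a LiDAR point cloud against checkpoints.

    lidar_xyz:       (N, 3) filtered LiDAR ground points (x, y, z)
    checkpoints_xyz: (M, 3) surveyed hard-surface checkpoints (x, y, z)
    """
    # Build a TIN (Delaunay triangulation) over the LiDAR points and
    # linearly interpolate an elevation at each checkpoint's x/y.
    tin = LinearNDInterpolator(lidar_xyz[:, :2], lidar_xyz[:, 2])
    z_interp = tin(checkpoints_xyz[:, :2])

    # Difference between interpolated LiDAR elevation and surveyed elevation.
    dz = z_interp - checkpoints_xyz[:, 2]
    dz = dz[~np.isnan(dz)]  # checkpoints outside the TIN hull yield NaN

    # RMSEz = sqrt(mean of squared vertical differences)
    return float(np.sqrt(np.mean(dz ** 2)))
```

SciPy's LinearNDInterpolator triangulates the input points with Qhull, which is equivalent to linear interpolation over a Delaunay TIN; a production workflow would apply the FOV limit and isolated-points filtering described above before this step.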
Measuring absolute accuracy, as opposed to relative accuracy or precision, requires additional data in the form of well-distributed independent checkpoints with an accuracy at least three times that of the LiDAR data under evaluation. Please refer to the ASPRS standards for full details of control and checkpoint requirements for data processing and reporting, or to the documents cited below for an overview. These independent checkpoints are vital to the absolute accuracy assessment.
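As a worked example of the 3x requirement: for a dataset with a 10 cm RMSEz accuracy class under the 2014 ASPRS standard cited below, vertical accuracy at the 95% confidence level is 1.96 × RMSEz = 19.6 cm, and the checkpoints should be surveyed to roughly 10/3 ≈ 3.3 cm or better.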
ASPRS Positional Accuracy Standards for Digital Geospatial Data, 2014.
USGS Lidar Base Specification, 2021, rev. A.