Geoprocessing and parameter estimation
What is least-squares adjustment (and what are redundant observations), and how can they be used in adjustments?
Least-squares adjustment carries out objective quality control and estimates the unknown parameters from redundant observations using mathematically well-defined rules (more observations are available than are necessary to determine the unknowns).
Random errors are dealt with in least-squares adjustment; it can help discover and handle unknown systematic errors by adding additional unknowns; and it can help discover and remove blunders.
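The idea of redundancy can be sketched with a minimal numpy example (an assumed illustration, not from the course): one unknown distance measured three times gives n = 3 observations for u = 1 unknown, so r = n - u = 2 redundant observations, and least squares picks the estimate minimizing the squared residuals.

```python
import numpy as np

# Three observations of the same distance (assumed values):
# n = 3 observations, u = 1 unknown, so r = n - u = 2 redundant observations.
L = np.array([100.02, 99.98, 100.03])

# Design matrix A for the observation equations L + V = A @ X.
A = np.ones((3, 1))

# With equal weights the least-squares estimate is X = (A'A)^-1 A'L,
# which here reduces to the arithmetic mean of the observations.
X, *_ = np.linalg.lstsq(A, L, rcond=None)
V = A @ X - L          # residuals of the observations
print(X[0])            # estimated distance, ~100.01
```

With redundancy the residuals V are not forced to zero, which is what makes quality control possible.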
What kinds of errors can occur in surveying, and what are their features?
Random errors: small; positive and negative errors of the same magnitude are equally probable; inherent and cannot be removed completely; dealt with by least-squares adjustment.
Systematic errors: dangerous because they accumulate; avoided by adequate instrument calibration, compensation, etc.; if the errors are known, correct them before the adjustment, or model them in the adjustment by adding additional unknowns.
Blunders: large errors due to carelessness; avoided by careful observation; discovered and removed by the adjustment.
Accuracy: closeness of the observations to the true value; affected by systematic errors, random errors, and blunders.
Precision: closeness of repeated observations to the sample mean; related only to random errors.
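The accuracy/precision distinction can be simulated in a few lines (an assumed illustration with made-up numbers): adding a systematic bias to observations leaves their spread about the mean (precision) unchanged but moves the mean away from the true value (accuracy).

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 50.0
noise = rng.normal(0.0, 0.01, size=1000)   # random errors only

unbiased = true_value + noise              # precise and accurate
biased = true_value + 0.05 + noise         # precise but inaccurate (bias 0.05)

# Precision: spread about the sample mean (identical for both series).
# Accuracy: closeness of the mean to the true value (worse for the biased one).
print(unbiased.std(), abs(unbiased.mean() - true_value))
print(biased.std(), abs(biased.mean() - true_value))
```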
What components does the least-squares adjustment include?
Two equally important components, which together yield quality-controlled observations and parameters with their accuracies:
Stochastic model: describes the precision; assumes uncorrelated random variables (diagonal variance-covariance matrix with the off-diagonal covariances equal to zero) carrying only random errors and following a normal distribution.
Cofactor matrix: the variance-covariance matrix scaled by the a-priori variance of unit weight.
Mathematical model: expresses mathematically the relations among the observations and between the observations and other quantities of interest (the parameters, or unknowns, of the adjustment).
Most models are nonlinear; nonlinear functions must first be linearized.
Mixed adjustment: observations and parameters are related by an implicit nonlinear function.
Observation equations: observations are explicitly related to the parameters.
Big pro: each observation generates one equation.
Condition equations: total elimination of the parameters.
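The stochastic model above can be sketched numerically (an assumed small example): the cofactor matrix Q is the variance-covariance matrix divided by the a-priori variance of unit weight, and the weight matrix P is its inverse, so more precise observations receive larger weights.

```python
import numpy as np

sigma0_sq = 1.0                            # a-priori variance of unit weight
sigmas = np.array([0.002, 0.004, 0.002])   # observation std devs (uncorrelated)

Sigma = np.diag(sigmas**2)   # diagonal variance-covariance matrix
Q = Sigma / sigma0_sq        # cofactor matrix (scaled variance-covariance)
P = np.linalg.inv(Q)         # weight matrix: P = Q^-1
print(np.diag(P))            # precise observations get larger weights
```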
How does the law of variance-covariance propagation work, and what are the steps of the adjustment?
Linearization: the model is linearized around a chosen point of expansion.
Minimization: based on minimizing the function V'PV, where V is the vector of observation residuals and P is the weight matrix, the inverse of the cofactor matrix Q. The minimum is found by introducing Lagrange multipliers and setting the partial derivatives to zero. Order of solution: estimated parameters X -> Lagrange multipliers K -> residuals V -> adjusted parameters and adjusted observations.
Cofactor matrices: obtained by the law of variance-covariance propagation (for a linear function y = A x, Q_y = A Q_x A'): Qw -> Qx -> Qv -> QLa.
A-posteriori variance of unit weight: sigma0^2 = V'PV / (r - u), where r - u (r equations, u unknowns) is the degree of freedom, equal to the number of redundant observations. From it: SigmaX -> SigmaV -> SigmaLa.
Iterations: let the adjustment converge properly, with both the residuals Vi and the corrections to Xi settling down; the iteration has converged if |(V'PV)i - (V'PV)i-1| < a small positive number.
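The steps above can be sketched as an iterative observation-equation adjustment. This is a hedged illustration under assumed data (an unknown 2-D point located by distances to three known stations, equal weights), not the course's worked example: linearize, solve the normal equations, update, and test convergence on V'PV.

```python
import numpy as np

# Assumed example: known station coordinates and observed distances to an
# unknown point near (4, 4). n = 3 observations, u = 2 unknown coordinates.
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
L = np.array([5.657, 7.211, 7.211])   # observed distances
P = np.eye(3)                          # weight matrix (equal weights)

X = np.array([1.0, 1.0])               # approximate parameters (expansion point)
vpv_prev = np.inf
for _ in range(20):
    d = np.linalg.norm(stations - X, axis=1)   # computed distances at X
    A = (X - stations) / d[:, None]            # Jacobian (design matrix)
    w = L - d                                  # misclosure vector
    # Normal equations: (A'PA) dX = A'Pw
    dX = np.linalg.solve(A.T @ P @ A, A.T @ P @ w)
    X = X + dX                                 # adjusted parameters
    V = A @ dX - w                             # residuals
    vpv = V @ P @ V
    if abs(vpv - vpv_prev) < 1e-12:            # convergence test on V'PV
        break
    vpv_prev = vpv

n, u = len(L), len(X)
sigma0_sq_post = vpv / (n - u)   # a-posteriori variance of unit weight
print(X, sigma0_sq_post)
```

Here each observation generates one equation (the "big pro" of the observation-equation model), and the redundancy n - u = 1 gives the degree of freedom for the a-posteriori variance of unit weight.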
The important equations of the three adjustment models.