UTILIZING THE UNCERTAINTY OF POLYHEDRA FOR THE RECONSTRUCTION OF BUILDINGS

The reconstruction of urban areas suffers from the dilemma of modeling urban structures in a generic or specific way, thus being too unspecific or too restrictive. One approach is to model and to instantiate buildings as arbitrarily shaped polyhedra and to recognize comprised man-made structures in a subsequent stage by geometric reasoning. To do so, we assume the existence of boundary representations for buildings with vertical walls and horizontal ground floors. To stay generic and to avoid the use of templates for pre-defined building primitives, no further assumptions for the buildings' outlines and the planar roof areas are made. Typically, roof areas are derived interactively or in an automatic process based on given point clouds or digital surface models. Due to the mensuration process and the assumption of planar boundaries, these planar faces are uncertain. Thus, a stochastic geometric reasoning process with statistical testing is appropriate to detect man-made structures, followed by an adjustment to enforce the deduced geometric constraints. Unfortunately, city models usually do not feature information about the uncertainty of geometric entities. We present an approach to specify the uncertainty of the planes corresponding to the planar patches, i.e., polygons bounding a building, analytically. This paves the way to conduct the reasoning process with just a few assumptions. We explicate and demonstrate the approach with real data.


INTRODUCTION

Motivation
For the representation of urban scenes, specific or generic models are conceivable, leading to the classical dilemma of being too unspecific or too restrictive (Heuel and Kolbe, 2001). Specific models comprise object knowledge, for instance about man-made structures, and can therefore directly be related to buildings. Parametric models, for instance, are often utilized for the representation of buildings, although they are unable to represent objects of arbitrary shape. Therefore, buildings of arbitrary complexity should be described by generic models. Especially boundary representations are suitable representations for polyhedra. In the context of 3D city modeling they are used, but are actually often obtained by converting parametric model instances.
Current research directions address the challenge of introducing building shape knowledge without being too restrictive. The detection of global regularities can be achieved for instance by clustering or hierarchical decomposition of planar elements, followed by a re-orientation and re-positioning to align the patches with the cluster centers, cf. (Zhou and Neumann, 2012, Verdie et al., 2015). Such approaches require specific thresholds and do not exploit the uncertainty of the extracted elements. In (Xiong et al., 2014, Xiong et al., 2015) the topological graph of identified roof areas is analyzed to instantiate and to combine pre-defined low-level shape primitives. Again, parametric models are avoided for the sake of flexibility. The result is a boundary representation with vertical walls and a horizontal ground floor.
The instantiation of the building models is based on observations which are inherently uncertain; this holds for automatic and semiautomatic acquisitions. This uncertainty results from the measurements, wrong model assumptions, and wrong interpretations or inferences. In the context of matching building models with images, this issue is pointed out in (Iwaszczuk et al., 2012). Thus a geometric reasoning to detect and to enforce man-made structures should take these uncertainties into account. In (Meidow, 2014) the use of pre-defined primitives is replaced by the recognition of man-made structures, i.e., geometric relations between adjacent roof areas. Geometric relations such as orthogonality or parallelism are found by statistical hypothesis testing and then enforced by a subsequent adjustment of the roof planes.
An automatic reconstruction of buildings is most often derived from airborne laser scanning data or aerial images. This offers the possibility to specify the uncertainty of roof areas: the uncertainty of the planes corresponding to the roof areas is a result of the plane fitting procedure. However, if the input for the reasoning process is already a generic representation of the building, e.g., an arbitrarily shaped polyhedron, the information about the acquisition and its uncertainty is lost, since 3D city models usually do not come along with this information. In this case the uncertainties of the planar patches bounding the building have to be derived just from the given boundary representation of the polyhedra. This is the main goal of this paper.

Contribution
We provide analytical expressions for the uncertainty of planes corresponding to planar patches represented by 3D polygons. These patches bound buildings as provided by 3D city models. By doing so, we consider multiply-connected regions, i.e., polygons or patches with holes, too. Examples are roof areas with openings for dormer windows and buildings with a flat roof around a courtyard.
The approach is based on closed-form solutions for the determination of moments of arbitrarily shaped 2D polygons (Steger, 1996b, Steger, 1996a). The idea is to replace the sums over point coordinates occurring in the inverse covariance matrix, i.e., the normal equation matrix, for plane fitting by integrals. This implies the assumption that the points are distributed uniformly on the planar patch. We extend this approach to multiply-connected regions and to planar polygons embedded in 3D space. Methodologically, this is a refinement of the determination of the covariance matrix for the centroid representation given a point cloud representing a planar patch (Förstner and Wrobel, 2016, p. 397, p. 436).
To demonstrate the feasibility and the usefulness of the approach, we explicate a complete stochastic reasoning process for a building which comprises the determination of the planes' uncertainties, the inference of geometric relations, and eventually the enforcement of the deduced constraints by adjustment.

THEORETICAL BACKGROUND
After clarifying our notation, we present in Section 2.2 the centroid representation for planes, the estimation of the corresponding parameter values, and the estimation of the accompanying covariance matrix. For the computation of the normal equation matrix, sums of coordinate moments have to be computed. These sums are then replaced in Section 2.4 by integrals to perform the transition to analytical expressions for the uncertainty of plane parameters given a 3D planar polygon.

Notation
We denote geometric 2D entities, namely points, with lower-case letters and 3D entities, namely 3D points and planes, with upper-case letters. For the representation of points, planes, and transformations we also use homogeneous coordinates. Homogeneous vectors and matrices are denoted with upright letters, e.g., x or H, Euclidean vectors and matrices with slanted letters, e.g., x or R. For homogeneous coordinates, '=' means an assignment or an equivalence up to a scaling factor λ ≠ 0. We distinguish between the name of a geometric entity, denoted by a calligraphic letter, and its representation, e.g., x or X. We use the skew-symmetric matrix S(x) to induce the cross product S(x)y = x × y of two vectors. The operation d = diag(D) extracts the diagonal elements of the matrix D as vector d, while D = Diag(d) constructs the diagonal matrix D based on the vector d.

Plane Parameters and their Uncertainty given a 3D Point Cloud

For the representation of an uncertain plane we utilize the centroid form, as it naturally results from the best fitting plane through a set of given points. The representation comprises the centroid X0, where the uncertainty of a point on the plane is smallest, the rotation matrix R for the transformation of the plane into a local coordinate system, the maximum and minimum variances σ²_α and σ²_β of the plane's normal, and the variance σ²_q of the position of X0 across the plane. Thus the nine parameters compiled in

A : (X0, R, σ_α, σ_β, σ_q)    (1)

constitute a convenient representation (Förstner and Wrobel, 2016, p. 436).
Parameter Estimation For the estimation of the plane parameters we consider the distances of the given I points X_i, i = 1, ..., I, to the best fitting plane A. Given a point X0 on A, the point-plane distances read

d_i = N^T (X_i − X0)

with the plane's normal N and the points in Euclidean representation. With the squared distances, the objective function of the least-squares method is

Ω = w Σ_i d_i².

We assume independent and identically distributed point coordinates with the weight w = 1/σ² for all observations. For the estimation of plane parameters with individual weights for the points, please refer to (Förstner and Wrobel, 2016, p. 436). Setting the derivative ∂Ω/∂X0 to zero reveals that the relation Σ_i (X_i − X0) = 0 is fulfilled for the centroid X0 with the coordinates

X0 = (1/I) Σ_i X_i.

For the determination of the plane's orientation we compute the eigendecomposition M = R Λ R^T of the matrix of second centralized moments

M = Σ_i (X_i − X0)(X_i − X0)^T    (7)

with the three eigenvalues λ1 ≥ λ2 ≥ λ3 in Λ = Diag([λ1, λ2, λ3]) and the orthonormal matrix R = [r1, r2, r3].
The normal of the plane is then N = r3.
The third eigenvalue is the sum of the squared residuals, thus the estimated variance of the point coordinates can be derived from

σ̂² = λ3 / (I − 3),

assuming no outliers are present. If the redundancy I − 3 is large enough, the estimated variance σ̂² can be used to fix the weight w.
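The estimation steps above (centroid, moment matrix, eigendecomposition, and variance of the residuals) can be sketched in a few lines of NumPy. This is a minimal sketch with the hypothetical helper name `fit_plane`, not the authors' implementation:

```python
import numpy as np

def fit_plane(points):
    """Best-fitting plane for an (I, 3) point array via the
    eigendecomposition of the second centralized moment matrix."""
    X = np.asarray(points, dtype=float)
    X0 = X.mean(axis=0)                 # centroid of the points
    M = (X - X0).T @ (X - X0)           # 3x3 moment matrix, Eq. (7)
    lam, R = np.linalg.eigh(M)          # eigh returns ascending eigenvalues
    lam, R = lam[::-1], R[:, ::-1]      # reorder to lam1 >= lam2 >= lam3
    N = R[:, 2]                         # plane normal = r3
    sigma2 = lam[2] / (len(X) - 3)      # estimated coordinate variance
    return X0, R, lam, N, sigma2
```

For exactly coplanar points the third eigenvalue, and hence the estimated variance, vanishes.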
In the following we assume that the original point cloud data are not available and that only the form of the planar patches is known; hence we need to make assumptions on σ² and the distribution of the 3D points.
Uncertainty Estimation For the specification of the plane's uncertainty in form of a covariance matrix, we centralize and rotate the given points in a way that the best-fitting plane for the resulting points is A = [0, 0, 1, 0]^T, i.e., the XY-plane, and the X- and the Y-axes correspond to the two major axes r1 and r2 of the moment matrix (7). The transformation is achieved by using the centroid X0 and the rotation matrix R. The covariance matrix is invariant w.r.t. this motion.
In the local coordinate system,

Z_i = q − α Y_i + β X_i

holds for the Z-coordinate of a point X_i, where α and β are the (small) angles between the plane's normal and the Z-axis. With the Jacobian

∂Z_i / ∂[q, α, β] = [1, −Y_i, X_i]

for each observational equation, the normal equation matrix becomes

N = w Σ_i [1, −Y_i, X_i]^T [1, −Y_i, X_i]    (13)

or, more explicitly,

N = w Diag([I, Σ_i Y_i², Σ_i X_i²]).    (15)

Please note that the matrix N is diagonal due to the special use of the coordinate system in the centroid, i.e., Σ_i X_i = Σ_i Y_i = Σ_i X_i Y_i = 0. The theoretical covariance matrix is Σ = N^{-1} and therefore diagonal, too. The variances of the estimated parameters q, α and β are

σ²_q = σ²/I,  σ²_α = σ²/Σ_i Y_i²,  σ²_β = σ²/Σ_i X_i²

for small angles α and β.
For the reasoning we need the uncertainty of the plane in the global coordinate system as obtained by the transformation given in the next paragraph.

Centroid representation to homogeneous representation
Given a plane A in the centroid representation (1), the homogeneous representation reads

A = [A_h^T, A_0]^T = [N^T, −D]^T

with the normal N being the third column of the rotation matrix R and the origin's distance D = N^T X0 to the plane A. We refer to A_h and A_0 as the homogeneous and the Euclidean part of the homogeneous coordinates A of the plane. The covariance matrix for the plane A = [0, 0, 1, 0]^T is then

Σ_AA = Diag([σ²_β, σ²_α, 0, σ²_q]).

The point transformation X' = HX with the motion matrix

H = [ R  X0 ; 0^T  1 ]

leads to the plane transformation A' = CA with the cofactor matrix C = (H^{-1})^T of H, see (Förstner and Wrobel, 2016, p. 258). Thus the covariance matrix of A' is

Σ_{A'A'} = C Σ_AA C^T.

In the following we interpret the entries in the normal equation matrix (13) as moments. Assuming a continuous distribution function of the points defining a planar surface patch, we consider the moments of arbitrary polygons.
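The transformation of a plane and its covariance into the global frame can be sketched as follows. `transform_plane` is a hypothetical name; the cofactor matrix is computed as the transposed inverse, which is valid here since a motion matrix has determinant one:

```python
import numpy as np

def transform_plane(A, R, X0, Sigma_A):
    """Map plane A = [Ah, A0] and its 4x4 covariance from the local
    (centroid) frame to the global frame with the motion (R, X0)."""
    H = np.eye(4)
    H[:3, :3], H[:3, 3] = R, X0
    C = np.linalg.inv(H).T              # cofactor matrix of H (det H = 1)
    A_glob = C @ A                      # transformed plane A' = C A
    Sigma_glob = C @ Sigma_A @ C.T      # first-order error propagation
    return A_glob, Sigma_glob
```

For example, translating the plane Z = 0, i.e., A = [0, 0, 1, 0]^T, by X0 = [0, 0, 2]^T yields A' = [0, 0, 1, −2]^T, the plane Z = 2.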

Moments of Arbitrary Polygons
Assuming that all points in a region have the same weight and are uniformly distributed, the normalized moments of order (m, n) of a region R are

γ_{m,n} = (1/A) ∫∫_R x^m y^n dx dy.

For m = n = 0 we get γ_{0,0} = 1 since A is the area of the region R. The normalized centralized moments are

μ_{m,n} = (1/A) ∫∫_R (x − x0)^m (y − y0)^n dx dy

with the centroid coordinates x0 = γ_{1,0} and y0 = γ_{0,1}. The centralized second moments can readily be computed via

μ_{2,0} = γ_{2,0} − x0²,  μ_{1,1} = γ_{1,1} − x0 y0,  μ_{0,2} = γ_{0,2} − y0².

Thus it is not necessary to compute these quantities explicitly if the second moments are known (Steger, 1996b).
With the centralized second moments, the normal equation matrix (15) can be written as

N = wS Diag([1, μ_yy, μ_xx])

with the number of points S. Using the equations for the moments of polygonal regions (Steger, 1996b), we determine the two moments μ_xx and μ_yy. This is done by applying Green's theorem, which allows to replace the area integrals by boundary integrals. By reducing the surface integral to a curve integral along the border of the region R, represented by a polygon P, we are able to compute the integral as a function of the polygon's vertices: Let P(x, y) and Q(x, y) be two continuously differentiable functions on the two-dimensional region R, and let b(t) be the boundary of R. Then

∫∫_R (∂Q/∂x − ∂P/∂y) dx dy = ∮_b P dx + Q dy    (30)

holds (Steger, 1996b).
For polygons, i.e., closed sequences of K straight line segments s_k, k = 1, ..., K, the parametrization of the segment s_k(t) is

s_k(t) = (1 − t) x_{k−1} + t x_k,  t ∈ [0, 1],

with the vertices x_{k−1} and x_k of the k-th segment s_k (edge); the indices k are taken cyclically. Thus

∮_b P dx + Q dy = Σ_{k=1}^{K} ∫_{s_k} P dx + Q dy    (32)

holds for (30). For the computation of the integral of the function F(x, y) = x^m y^n over R, the function F(x, y) has to be decomposed into ∂Q/∂x and ∂P/∂y according to (30), which cannot be done uniquely. Table 1 summarizes convenient decompositions for the required moments, i.e., the area, the centroid coordinates, and the centralized second moments of the region R (Steger, 1996b).
A decomposition for arbitrary moments can be found in (Steger, 1996a).
Integration and using the decompositions listed in Table 1 yields the following formulas; for details please refer to (Steger, 1996b). The polygon's area is

A = (1/2) Σ_{k=1}^{K} (x_k y_{k+1} − x_{k+1} y_k),    (33)

taking index addition modulo K into account. The coordinates of the centroid are

x0 = 1/(6A) Σ_k (x_k + x_{k+1})(x_k y_{k+1} − x_{k+1} y_k),
y0 = 1/(6A) Σ_k (y_k + y_{k+1})(x_k y_{k+1} − x_{k+1} y_k),

and the second (non-central) moments read

γ_{2,0} = 1/(12A) Σ_k (x_k² + x_k x_{k+1} + x_{k+1}²)(x_k y_{k+1} − x_{k+1} y_k),
γ_{0,2} = 1/(12A) Σ_k (y_k² + y_k y_{k+1} + y_{k+1}²)(x_k y_{k+1} − x_{k+1} y_k),
γ_{1,1} = 1/(24A) Σ_k (x_k y_{k+1} + 2 x_k y_k + 2 x_{k+1} y_{k+1} + x_{k+1} y_k)(x_k y_{k+1} − x_{k+1} y_k).

The 2 × 2 matrix G of these second moments is used in Section 3 to compute a polygon's orientation in 2D, which will then be transformed into the 3D space.
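These closed-form expressions translate directly into code. The following is a minimal NumPy sketch (hypothetical name `polygon_moments`; a single, counter-clockwise ring is assumed):

```python
import numpy as np

def polygon_moments(xy):
    """Area, centroid, and centralized second moments of a simple 2D
    polygon, given as a (K, 2) vertex array, via boundary integrals
    (Green's theorem, cf. Steger 1996)."""
    x, y = np.asarray(xy, dtype=float).T
    xn, yn = np.roll(x, -1), np.roll(y, -1)        # cyclic successors
    a = x * yn - xn * y                            # cross terms per edge
    A = 0.5 * a.sum()                              # signed area, Eq. (33)
    x0 = ((x + xn) * a).sum() / (6.0 * A)          # centroid
    y0 = ((y + yn) * a).sum() / (6.0 * A)
    g20 = ((x**2 + x * xn + xn**2) * a).sum() / (12.0 * A)
    g02 = ((y**2 + y * yn + yn**2) * a).sum() / (12.0 * A)
    g11 = ((x * yn + 2 * x * y + 2 * xn * yn + xn * y) * a).sum() / (24.0 * A)
    mu = np.array([[g20 - x0**2, g11 - x0 * y0],   # centralized moments
                   [g11 - x0 * y0, g02 - y0**2]])
    return A, np.array([x0, y0]), mu
```

For multiply-connected polygons the sums are simply accumulated over all rings, with interior rings oriented clockwise so that their signed areas are negative.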

APPROACH
Based on the aforementioned concepts, we explicate our approach in detail. After the computation of uncertain planes from given 3D polygons, we explain the subsequent geometric reasoning consisting of hypothesis generation, statistical testing, and adjustment.

Uncertainty of Planes
First of all, we determine the plane defined by the K vertices {X1, X2, ..., XK} of the planar polygon embedded in 3D. The plane's normal vector A_h is

A_h = (1/2) Σ_{k=1}^{K} X_k × X_{k+1}

(Mäntylä, 1988, p. 218), here in compact vector representation and with index addition modulo K. For boundary representations defining vertex points by the intersection of three or more planes, three non-collinear vertices are sufficient, i.e.,

A_h = (X2 − X1) × (X3 − X1),

to determine the normal. Figure 1 shows an example where this assumption is violated: several points are incident to two planes only and are hence no vertices.
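The cross-product sum has a one-line counterpart in code; a sketch with the hypothetical name `polygon_normal`:

```python
import numpy as np

def polygon_normal(V):
    """Unit normal of a planar 3D polygon from its K vertices via the
    sum of cross products of consecutive vertices (cf. Mäntylä 1988)."""
    V = np.asarray(V, dtype=float)
    W = np.roll(V, -1, axis=0)                # cyclic successor vertices
    Ah = 0.5 * np.cross(V, W).sum(axis=0)     # area-weighted normal A_h
    return Ah / np.linalg.norm(Ah)            # normalize to unit length
```
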
For the point X defined by the mean coordinates of the vertices, the incidence X ∈ A holds. Expressed in homogeneous coordinates, this reads X^T A = 0 with the homogeneous 3D point X. To determine the moments of the polygon, we rotate the vertices into a plane A with Z = const. for all points, i.e., A = [0, 0, 1, −Z]^T. This can be achieved for instance by means of the smallest rotation (Förstner and Wrobel, 2016, p. 340) from vector a = [0, 0, 1]^T to vector b = N for the application at hand. For horizontal ground floors N = [0, 0, −1]^T holds and we use Q = Diag([1, −1, −1]) as rotation matrix. The transformed coplanar points are

X'_k = Q^T X_k,  k = 1, ..., K,

with identical Z-coordinates Z'_k = Z for all transformed points.
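One common closed form for the smallest rotation taking a unit vector a to a unit vector b is R = I + S(v) + S(v)²/(1 + c) with v = a × b and c = a^T b. This sketch (hypothetical name `smallest_rotation`) assumes b ≠ −a, where the formula is undefined:

```python
import numpy as np

def smallest_rotation(a, b):
    """Minimal rotation matrix R with R @ a = b for unit vectors a, b
    (cf. Förstner and Wrobel 2016); undefined for b = -a."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    v, c = np.cross(a, b), a @ b
    S = np.array([[0.0, -v[2], v[1]],          # skew-symmetric S(v)
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + S + (S @ S) / (1.0 + c)
```
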
The plane A constitutes a 2D coordinate system and the polygon's vertices have coordinates (X_k, Y_k). We assume the polygons to be possibly multiply-connected, i.e., they could enclose holes. Therefore we have to distinguish between exterior and potential interior boundaries, often denoted as rings. Figure 1 shows an example from a data set with LoD2 buildings provided by the Magistrat of the city of Linz, Austria (Linz, 2011). The buildings have been reconstructed by a semiautomatic approach with standard patterns, based on a digital surface model represented by a point cloud and given building outlines. With R rings representing a polygon, the polygon's area is

A = Σ_{r=1}^{R} A_r

with the signed areas A_r according to (33) for each ring. The centroid of the polygon, being the normalized first moment, is

x0 = 1/(6A) Σ_{r=1}^{R} Σ_{k=1}^{K_r} (x_{k,r} + x_{k+1,r}) a_{k,r},
y0 = 1/(6A) Σ_{r=1}^{R} Σ_{k=1}^{K_r} (y_{k,r} + y_{k+1,r}) a_{k,r}

with x_{k,r} = [x_{k,r}, y_{k,r}]^T being the k-th vertex of the ring r with K_r vertices and edges, and the cross terms a_{k,r} = x_{k,r} y_{k+1,r} − x_{k+1,r} y_{k,r}. The matrix of normalized second moments contains the elements

γ_{2,0} = 1/(12A) Σ_r Σ_k (x_{k,r}² + x_{k,r} x_{k+1,r} + x_{k+1,r}²) a_{k,r},
γ_{0,2} = 1/(12A) Σ_r Σ_k (y_{k,r}² + y_{k,r} y_{k+1,r} + y_{k+1,r}²) a_{k,r},
γ_{1,1} = 1/(24A) Σ_r Σ_k (x_{k,r} y_{k+1,r} + 2 x_{k,r} y_{k,r} + 2 x_{k+1,r} y_{k+1,r} + x_{k+1,r} y_{k,r}) a_{k,r}.    (48)

The matrix M of centralized second moments reads

M = [ γ_{2,0} − x0²   γ_{1,1} − x0 y0 ; γ_{1,1} − x0 y0   γ_{0,2} − y0² ]    (49)

and its eigendecomposition M = U Λ U^T yields the eigenvectors U = [u1, u2] and the two eigenvalues [λ1, λ2] = diag(Λ).
Observe that the eigenvalues λ1 and λ2 of the 2×2 matrix M in (49), derived by integration, correspond to the eigenvalues λ1 and λ2 of the 3×3 matrix M in (7), derived by summation over all points.
The polygon's centroid x0 on the plane A = [0, 0, 1, −Z]^T represented in the 3D space can then be back-transformed via

X0 = Q [x0^T, Z]^T    (50)

to obtain the polygon's centroid X0 in the 3D space. And eventually, the eigenvectors u1 and u2 in the plane A are rotated according to

r1 = Q [u1^T, 0]^T,  r2 = Q [u2^T, 0]^T    (51)

to determine the rotation matrix R = [r1, r2, r3] as part of the plane's representation (1).
For the determination of the plane's uncertainty we consider a virtual scanning process yielding equally spaced points on the plane. Given the area A of a polygon and a sampling distance Δ in two orthogonal directions, the number of sampling points in the polygon is S = A/Δ². Assuming a known, representative weight w = 1/σ² for all coordinates in (15), we get N11 = wS, and therefore, using the eigenvalues λ1 ≥ λ2 of M in (49),

σ²_q = σ²Δ²/A,  σ²_α = σ²Δ²/(A λ2),  σ²_β = σ²Δ²/(A λ1).    (52)-(54)

Obviously, just the product σΔ of the assumed uncertainty σ of the virtual sampling points and the spacing Δ has to be specified to compute the complete representation of the uncertain plane A corresponding to a given polygon P. The uncertain plane in centroid representation (1) is completely specified by (50), (51), (52), (53) and (54).
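Given the area A, the eigenvalues of the centralized moment matrix, and the product σΔ, the three variances follow by simple arithmetic; a sketch with the hypothetical name `plane_uncertainty`:

```python
def plane_uncertainty(A, lam1, lam2, sigma_delta):
    """Variances of the centroid-form plane parameters from the patch
    area A, the eigenvalues lam1 >= lam2 of the normalized centralized
    second moments, and sigma_delta = sigma * Delta of the virtual scan."""
    var_q = sigma_delta**2 / A                # position across the plane
    var_alpha = sigma_delta**2 / (A * lam2)   # maximum variance of the normal
    var_beta = sigma_delta**2 / (A * lam1)    # minimum variance of the normal
    return var_q, var_alpha, var_beta
```

For a unit square (A = 1, λ1 = λ2 = 1/12) and σΔ = 0.1 this yields σ²_q = 0.01 and σ²_α = σ²_β = 0.12.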

Geometric Reasoning
For our boundary representation, we assume that each vertex is defined by at least three planes intersecting in one point. Adjacent faces lying in the same plane will therefore evoke undefined points. Thus, in a pre-processing step, we check the faces' binary relations encoded in the given boundary representation for identical planes and merge faces corresponding to the same plane.
Beyond that, we restrict ourselves to the relation orthogonality as the most dominant geometric constraint. For further conceivable constraints please refer to (Heuel, 2004) and (Förstner and Wrobel, 2016, p. 304ff).
The space of hypotheses is given by all pairs of adjacent faces represented by polygons. For the testing of geometric relations please refer to (Förstner and Wrobel, 2016, p. 304ff). Once all potential relations are tested, we have a set of constraints at hand. These result from those hypotheses which could not be rejected by the tests. For the concluding adjustment, a set of consistent, i.e., non-contradicting, and non-redundant constraints is mandatory, since redundant constraints will lead to singular covariance matrices. Since we are dealing with imprecise and noisy observations, we have to face the possibility of non-rejected hypotheses which are contradictory, too. We utilize the greedy algorithm proposed in (Meidow and Hammer, 2016) and (Meidow et al., 2009) to select a set of independent constraints automatically.
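As an illustration of such a hypothesis test, orthogonality of two planes can be tested on the vanishing dot product of their normals. This is a simplified first-order sketch (hypothetical name, 3x3 normal covariances only), not the full testing machinery of (Förstner and Wrobel, 2016); 3.841 is the chi-square critical value for one degree of freedom at a significance level of 0.05:

```python
import numpy as np

def orthogonality_not_rejected(N1, Sigma1, N2, Sigma2, crit=3.841):
    """Test d = N1.N2 = 0 (orthogonal planes) given the normals and
    their 3x3 covariances; returns True if the hypothesis survives."""
    d = N1 @ N2                                    # orthogonality defect
    var_d = N2 @ Sigma1 @ N2 + N1 @ Sigma2 @ N1    # first-order variance
    T = d * d / var_d                              # test statistic ~ chi^2_1
    return T < crit                                # True: cannot reject
```
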

EXPERIMENTS
To demonstrate the usefulness of the approach, we utilize a polyhedral building model which has been instantiated based on an airborne laser scan. Then we derive the uncertainty of the planes corresponding to the faces of the boundary representation, perform the geometric reasoning, determine a set of independent constraints, and eventually conduct an adjustment to enforce the inferred constraints.
Model Instantiation In the data analysis step, the points of the laser scan have been classified into roof and non-roof points.
Then the points representing roof areas have been grouped by utilizing the RANSAC-based shape detector (Schnabel et al., 2007) provided by the mesh processing software CLOUDCOMPARE.
Figure 2 shows points captured by a RIEGL LMS-Q560 scanner representing the roof of a farmhouse and a corresponding boundary representation. The reconstruction has been carried out by initially performing a 2D triangulation and computing alpha-shapes to determine the building's outline. The triangles with points of different planar point groups constitute the boundaries of the roof sections. By analyzing the run of these borders we obtained the interior roof structure, i.e., ridge lines, step edges, and roof valleys. All traversed lines have been simplified by vertex decimation. The solid has been closed by assuming vertical walls on top of the irregularly shaped outline.
The example shows the result of an analysis step, from the point cloud to a polyhedral boundary model. In this paper we do not assume the uncertainty of the resulting planar patches to be known. Instead, we assume that their standard deviations depend, up to a common factor, on the form of the patches.
As a result, we obtain a generic representation of the building. Only for the non-observed building parts, i.e., the walls and the floor, model assumptions are made. However, the result does not provide any information about its uncertainty.
Reasoning After the determination of the planes' uncertainties according to Section 3.1, we determined the set of constraints with a significance level of α = 0.05 (Section 3.2). Very small or thin roof areas are usually very uncertain, too. Therefore, it is likely that hypotheses involving these areas are not rejected and wrong constraints will be inferred. Thus, we suppress constraints with faces featuring areas smaller than 16 m².
Figure 3 shows the boundary representation of the building with the inferred constraints: 28 times orthogonality and 5 times identity have been detected. The numerous constraints for the ground floor are not visualized for the sake of clarity.
A vertex of our boundary representation is defined by the intersection of at least three planes defined by the building's faces. Thus identical or almost identical planes will lead to indeterminacy. Therefore we merge adjacent faces which have been identified to lie in the same plane in a pre-processing step. Of course, the detection and enforcement of further constraints such as identical slopes for the roof areas is conceivable.

CONCLUSIONS AND OUTLOOK
We derived analytical expressions for the uncertainty of planes corresponding to the planar patches bounding a polyhedron.This paves the way to stochastic geometric reasoning for generic city models, i.e., the detection and enforcement of man-made structures such as orthogonality, parallelism, or identity, e.g., for groups of buildings.
The estimation of a plane's uncertainty based on a given point cloud yields a normal equation matrix. Assuming a continuous distribution function for the points defining a planar patch, we interpret the entries in the normal equation matrix as moments of arbitrarily shaped 2D polygons. Furthermore, by considering the signed areas of polygons we are able to cope with multiply-connected polygons, i.e., polygons with interior boundaries defining holes.

Figure 1. Building with the ID 1277 from the Linz data set. The building features a courtyard; its floor has been modeled by a 2-connected polygon with an exterior and an interior boundary or ring.
Figure 4 shows the result of this simplification. After applying the hypothesis testing with the new faces, a set of 20 orthogonality constraints remains. Some of the inferred 20 constraints are redundant. The greedy algorithm identified a set of 15 independent constraints which are depicted together with the adjusted, i.e., constrained, boundary representation in Figure 5. Now the ridge lines and the eaves are horizontal and the building's outline is rectangular.

Figure 2. Points of an airborne laser scan captured by a RIEGL LMS-Q560 scanner and the deduced boundary representation in two views. The points represent the roof areas of a farmhouse.

Table 1. Decompositions for the computation of moments, following (Steger, 1996b).