Abstract
Modeling urban scenes automatically is an important problem for both GIS and non-GIS specialists, with applications in urban planning, autonomous driving, and virtual reality. In this paper, we present a novel contour deformation approach to generate regularized and vectorized 3D building models from an orthophoto and a digital surface model (DSM). The proposed method has four major stages: dominant direction extraction, target direction assignment, contour deformation, and model generation. To begin with, we extract dominant directions for each building contour in the orthophoto. Then every edge of the contour is assigned one of the dominant directions via a Markov random field (MRF). Taking the assigned direction as the target, we define a deformation energy with the Advanced Most-Isometric ParameterizationS (AMIPS) to align the contour to the dominant directions. Finally, the aligned contour is simplified and extruded into a 3D model. Through the alignment deformation, we are able to straighten the contour while preserving sharp turning corners. Experiments on a public dataset show that our contour deformation based urban modeling approach is more accurate and robust than state-of-the-art methods.
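The edge-to-direction labeling step can be illustrated with a minimal sketch. Since a closed contour is a chain of edges, a chain-structured MRF with a unary term (angular deviation from a candidate dominant direction) and a pairwise term (a constant penalty for switching labels between consecutive edges) can be minimized exactly by dynamic programming. All function names, the `smooth` weight, and the toy angles below are illustrative assumptions, not the paper's actual energy or solver:

```python
def assign_directions(edge_angles, dominant_dirs, smooth=0.5):
    """Assign each contour edge one of the dominant directions
    (angles in degrees) by minimizing a chain MRF energy:
    unary = angular deviation, pairwise = label-switch penalty.
    Illustrative sketch only; the paper uses a general MRF."""
    def dev(a, d):
        # smallest deviation; undirected lines are modulo 180 degrees
        diff = abs(a - d) % 180.0
        return min(diff, 180.0 - diff)

    n, k = len(edge_angles), len(dominant_dirs)
    # cost[i][j]: best energy for edges 0..i with edge i labeled j
    cost = [[0.0] * k for _ in range(n)]
    back = [[0] * k for _ in range(n)]
    for j in range(k):
        cost[0][j] = dev(edge_angles[0], dominant_dirs[j])
    for i in range(1, n):
        for j in range(k):
            best, arg = min(
                (cost[i - 1][p] + (smooth if p != j else 0.0), p)
                for p in range(k)
            )
            cost[i][j] = dev(edge_angles[i], dominant_dirs[j]) + best
            back[i][j] = arg
    # backtrack the optimal labeling
    j = min(range(k), key=lambda q: cost[n - 1][q])
    labels = [j]
    for i in range(n - 1, 0, -1):
        j = back[i][j]
        labels.append(j)
    return labels[::-1]

# Toy contour: edges near 0° and 90°, with one ambiguous 44° edge
# that the smoothness term pulls toward its neighbors' label.
angles = [1.0, 2.0, 44.0, 91.0, 89.0]
dirs = [0.0, 90.0]
print(assign_directions(angles, dirs))  # → [0, 0, 0, 1, 1]
```

The smoothness weight trades fidelity for regularity: a larger `smooth` discourages label changes and yields longer straight runs, at the cost of snapping ambiguous edges to a neighboring direction.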