Article

Model Synthesis: A General Procedural Modeling Algorithm.

University of North Carolina at Chapel Hill.
IEEE Transactions on Visualization and Computer Graphics 08/2010; DOI: 10.1109/TVCG.2010.112
Source: PubMed

ABSTRACT We present a method for procedurally modeling general complex 3D shapes. Our approach can automatically generate complex models of buildings, man-made structures, or urban datasets in a few minutes based on user-defined inputs. The algorithm attempts to generate complex 3D models that resemble a user-defined input model and that satisfy various dimensional, geometric, and algebraic constraints to control the shape. These constraints are used to capture the intent of the user and generate shapes that look more natural. We also describe efficient techniques to handle complex shapes and highlight the algorithm's performance on many different types of models. We compare model synthesis algorithms with other procedural modeling techniques, discuss the advantages of different approaches, and describe a close connection between model synthesis and context-sensitive grammars.
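
The abstract summarizes the algorithm only at a high level. As a rough, illustrative sketch of the adjacency-constraint idea behind model synthesis, the short Python program below records which labels may sit next to each other in a small example grid and then fills a larger grid one cell at a time, keeping only labels consistent with the neighbors already placed. This is a minimal 2D sketch under our own simplifying assumptions (the grid-of-labels representation, the helper names legal_pairs and synthesize, and the restart-on-dead-end strategy are all illustrative); the paper's method works on 3D models and supports the additional dimensional, geometric, and algebraic constraints mentioned above.

import random

def legal_pairs(example):
    """Collect the horizontal and vertical label pairs that occur in the example grid."""
    horiz, vert = set(), set()
    rows, cols = len(example), len(example[0])
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                horiz.add((example[r][c], example[r][c + 1]))
            if r + 1 < rows:
                vert.add((example[r][c], example[r + 1][c]))
    return horiz, vert

def synthesize(example, rows, cols, attempts=100):
    """Fill a rows x cols grid so every adjacent label pair also occurs in the example."""
    horiz, vert = legal_pairs(example)
    labels = sorted({v for row in example for v in row})
    for _ in range(attempts):
        out = [[None] * cols for _ in range(rows)]
        ok = True
        for r in range(rows):
            for c in range(cols):
                options = [v for v in labels
                           if (c == 0 or (out[r][c - 1], v) in horiz)
                           and (r == 0 or (out[r - 1][c], v) in vert)]
                if not options:      # dead end: scrap this attempt and retry
                    ok = False
                    break
                out[r][c] = random.choice(options)
            if not ok:
                break
        if ok:
            return out
    raise RuntimeError("no consistent grid found; try a richer example")

if __name__ == "__main__":
    random.seed(0)
    example = [["grass", "wall", "grass"],
               ["grass", "wall", "grass"]]
    for row in synthesize(example, 4, 6):
        print(" ".join(f"{v:5}" for v in row))

Restarting on a dead end is the crudest possible recovery strategy here; it stands in for the more careful ordering and constraint handling a real implementation would need.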

  • ABSTRACT: Solid textures, comprising 3D particles embedded in a matrix in a regular or semiregular pattern, are common in natural and man-made materials such as brickwork, stone walls, and plant cells in a leaf. We present a novel technique for synthesizing such textures, starting from 2D image exemplars that provide cross-sections of the desired volume texture. The shapes and colors of typical particles embedded in the structure are estimated from their 2D cross-sections. Particle positions in the texture images are also used to guide spatial placement of the 3D particles during synthesis of the 3D texture. Our experiments demonstrate that our algorithm can produce higher-quality structures than previous approaches; the results are both compatible with the input images and have a plausible 3D nature.
    IEEE Transactions on Visualization and Computer Graphics 03/2013; 19(3):460-469.
    (A simplified sketch of this cross-section-guided placement appears after this list.)
  • ABSTRACT: We propose a method that generates stylized building models from examples. Our method requires only minimal user input to capture the appearance of a Manhattan world (MW) building, and can automatically retarget the captured ‘look and feel’ to new models. The key contribution is a novel representation, the ‘style sheet’, which is captured independently of a building's structure. It summarizes characteristic shape and texture patterns on the building. In the retargeting stage, a style sheet is used to decorate new buildings of potentially different structures. Consistent face groups are proposed to capture complex texture patterns from the example model and to preserve those patterns in the retargeted models. We demonstrate how to learn such style sheets from different MW buildings and show the results of using them to generate novel models.
    Computer Graphics Forum 10/2013.
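
As a heavily simplified illustration of the first related abstract above (not the authors' method), the following Python sketch shows how particle statistics measured in a 2D cross-section could drive placement of particles in a 3D voxel volume: a standard stereological relation for equal spheres converts profiles-per-area into spheres-per-volume, and each placed particle is stamped into the grid as a sphere. The function place_particles and all of its parameters are assumptions made for this sketch.

import numpy as np

def place_particles(profiles_per_area, radius, volume_shape, seed=0):
    """Stamp spherical particles into a voxel grid at a density implied by a 2D cross-section."""
    rng = np.random.default_rng(seed)
    # Stereology for equal spheres: profiles per unit area = spheres per unit volume * diameter,
    # so spheres per unit volume = profiles_per_area / (2 * radius).
    spheres_per_volume = profiles_per_area / (2.0 * radius)
    count = max(1, int(round(spheres_per_volume * np.prod(volume_shape))))
    # Keep particle centers far enough from the boundary that whole spheres fit.
    centers = rng.uniform(radius, np.array(volume_shape, dtype=float) - radius, size=(count, 3))
    zz, yy, xx = np.indices(volume_shape)
    vol = np.zeros(volume_shape, dtype=np.uint8)
    for cz, cy, cx in centers:
        inside = (zz - cz) ** 2 + (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
        vol[inside] = 1          # mark voxels covered by this particle
    return vol

if __name__ == "__main__":
    grid = place_particles(profiles_per_area=0.02, radius=3.0, volume_shape=(32, 32, 32))
    print("particle voxels:", int(grid.sum()), "of", grid.size)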