This page lists all scientific publications related to the HDWorlds project.

Publication summary


  • Data-driven Authoring of Large-scale Ecosystems

    K. Kapp, J. Gain, E. Guérin, E. Galin, A. Peytavie
    ACM Transactions on Graphics, Proceedings of SIGGRAPH Asia, 2020

    Abstract: Ecosystem simulations embody many of the botanical influences, such as sunlight, temperature, and moisture, but require hours to complete, while synthesis from statistical distributions tends not to capture fine-scale variety and complexity. Instead, we leverage real-world data and machine learning to derive a canopy height model (CHM) for unseen terrain provided by the user. Trees in the canopy layer are then fitted to the resulting CHM through a constrained iterative process that optimizes for a given distribution of species, and, finally, an understorey layer is synthesised using distributions derived from biome-specific undergrowth simulations. Such a hybrid data-driven approach has the advantage that it incorporates subtle biotic, abiotic, and disturbance factors implicitly encoded in the source data and evidences accepted biological behaviour, such as self-thinning, climatic adaptation, and gap dynamics.

  • Simulation, Modeling and Authoring of Glaciers

    O. Argudo, E. Galin, A. Peytavie, A. Paris, E. Guérin
    ACM Transactions on Graphics, Proceedings of SIGGRAPH Asia, 2020

    Abstract: In this paper, we combine a Shallow Ice Approximation simulation with a procedural amplification process to author high-resolution realistic glaciers. Our multiresolution method allows the interactive simulation of the formation and the evolution of glaciers over hundreds of years. The user can easily modify the environment variables, such as the average temperature or precipitation rate, to control the glacier growth, or directly use brushes to sculpt the ice or bedrock with interactive feedback. Mesoscale and small-scale landforms that are not captured by the glacier simulation, such as crevasses, moraines, seracs, ogives, or icefalls, are synthesized using procedural rules inspired by observations in glaciology and according to the physical parameters derived from the simulation. Our method lends itself to seamless integration into production pipelines to decorate reliefs with glaciers and realistic ice features.

  • Content-aware texture deformation with dynamic control

    G. Guingo, F. Larue, B. Sauvage, N. Lutz, J-M. Dischler, M-P. Cani
    Computers & Graphics, 91, 2020

    A few frames of a soft bouncing cube with texture deformation guided by the underlying surface animation. Top row: our texture deformation model. Bottom row: standard texture mapping. Dark parts, defined as elastic, shrink on impact with the ground and stretch after the rebound, when the object tends to recover its rest shape.

    Abstract: Textures improve the appearance of virtual scenes by mapping visual details on the surface of 3D objects. Various scenarios – such as real-time animation, interactive texture modelling, or offline post-production – require textures to be deformed in a controllable and plausible manner. We propose a novel approach to model and control texture deformations, which is easy to implement in a standard graphics pipeline. The deformation is implemented at pixel resolution as a warping in the parametric domain. The warping is controlled locally and dynamically by real-time integration along the streamlines of a pre-computed flow field. We propose a technique to pre-compute the flow field from a simple scalar map representing heterogeneous dynamic behaviors. Moreover, to manage sampling issues arising in over-stretched areas during deformation, we provide a mechanism based on re-sampling and texture synthesis. Warping may alternatively be controlled by deformation of the underlying surface, environment parameters or interactive editing, which demonstrates the versatility of our approach.
    [Get the paper here]
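    The streamline integration at the heart of the warping can be sketched in a few lines of Python. This is an illustrative sketch, not the paper's implementation: the function names and the constant flow field below are hypothetical, and the paper integrates a pre-computed flow field derived from a scalar behaviour map.

```python
def warp_uv(uv, flow, amount, steps=16):
    """Advect a texture coordinate along the streamlines of a flow field.

    `flow(u, v)` returns a 2D vector; forward-Euler integration over a
    distance controlled by `amount` yields the warped lookup coordinate.
    """
    u, v = uv
    h = amount / steps
    for _ in range(steps):
        fu, fv = flow(u, v)
        u, v = u + h * fu, v + h * fv
    return u, v

# Hypothetical constant flow pushing texels in +u: the lookup drifts linearly.
warped = warp_uv((0.25, 0.5), lambda u, v: (1.0, 0.0), amount=0.1)
```

    Driving `amount` from the surface animation (or from interactive editing) is what produces the shrink-and-stretch behaviour shown in the figure above.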

  • Semi-Procedural Textures Using Point Process Texture Basis Functions

    P. Guehl, R. Allegre, J-M. Dischler, B. Benes, E. Galin
    Eurographics Symposium on Rendering, 2020


    A single texture or multiple texture maps (a), and a binary structure (b), are used to generate a semi-procedural output (d). The result is a novel texture representation in which structure is procedural (d, top) and details are data-driven (d, bottom). Generated textures have procedural properties: infinity, no repetition, self-consistency, and genericity. The structure can be edited through its parameters (only three of them are shown here). Morphing is implicitly obtained by interpolating these parameters. The data-driven details guarantee a good visual match with the exemplar. We call Semi-procedural synthesis the synthesis from a structure that matches the input exemplar, and Semi-procedural editing the synthesis from a user-edited structure. A rendered view of the input material is shown for comparison (c).

    Abstract: We introduce a novel semi-procedural approach that avoids drawbacks of procedural textures and leverages advantages of data-driven texture synthesis. We split synthesis into two parts: 1) structure synthesis, based on a procedural parametric model, and 2) color detail synthesis, which is data-driven. The procedural model consists of a generic Point Process Texture Basis Function (PPTBF), which extends sparse convolution noises by defining rich convolution kernels. They consist of a window function multiplied with a correlated statistical mixture of Gabor functions, both designed to encapsulate a large span of common spatial stochastic structures, including cells, cracks, grains, scratches, spots, stains, and waves. Parameters can be prescribed automatically by supplying binary structure exemplars. As for noise-based Gaussian textures, the PPTBF is used as a stand-alone function, avoiding classification tasks that occur when handling multiple procedural assets. Because the PPTBF is based on a single set of parameters, it allows for continuous transitions between different visual structures and easy control over its visual characteristics. Color is consistently synthesized from the exemplar using a multiscale parallel texture synthesis by numbers, constrained by the PPTBF. The generated textures are parametric, infinite, and avoid repetition. The data-driven part is automatic and guarantees strong visual resemblance with inputs.
    [Get the paper and code here]
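    The sparse-convolution ingredient behind the PPTBF can be illustrated with plain Gabor noise. This is a simplified sketch, not the PPTBF itself: it omits the window function and the correlated statistical mixture, and every name and parameter value is illustrative.

```python
import math
import random

def gabor(x, y, a=4.0, f0=8.0, omega=0.0):
    """Gabor kernel: a Gaussian envelope times an oriented cosine wave."""
    envelope = math.exp(-math.pi * a * a * (x * x + y * y))
    return envelope * math.cos(2.0 * math.pi * f0 * (x * math.cos(omega) + y * math.sin(omega)))

def sparse_gabor_noise(x, y, impulses_per_cell=8, seed=0):
    """Sparse convolution noise: sum weighted Gabor kernels at random impulses.

    Only the 3x3 neighbourhood of unit cells around (x, y) is visited, so the
    function is locally computable over an unbounded domain.
    """
    total = 0.0
    for ci in range(math.floor(x) - 1, math.floor(x) + 2):
        for cj in range(math.floor(y) - 1, math.floor(y) + 2):
            # Per-cell deterministic random stream: same impulses every query.
            rng = random.Random(hash((ci, cj, seed)))
            for _ in range(impulses_per_cell):
                px, py = ci + rng.random(), cj + rng.random()
                w = rng.uniform(-1.0, 1.0)
                total += w * gabor(x - px, y - py, omega=rng.uniform(0.0, math.pi))
    return total
```

    Because each cell seeds its own random stream, any point can be evaluated independently, which is what makes this family of functions infinite and repetition-free.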

  • Segment Tracing Using Local Lipschitz Bounds

    E. Galin, E. Guérin, A. Paris, A. Peytavie
    Eurographics, 2020

    We tackle the problem of computationally intensive Sphere Tracing by evaluating local Lipschitz bounds along the ray during the marching process. Our Segment Tracing method reduces the number of field function queries #f and accelerates ray-object intersection.

    Abstract: We introduce Segment Tracing, a new algorithm that accelerates the classical Sphere Tracing method for computing the intersection between a ray and an implicit surface. Our approach consists of computing the Lipschitz bound locally over a segment to improve the marching step computation and accelerate the overall process. We describe the computation of the Lipschitz bound for different operators and primitives. We demonstrate that our algorithm significantly reduces the number of field function queries compared to previous methods, without the need for additional accelerating data structures. Our method can be applied to a vast variety of implicit models ranging from hierarchical procedural objects built from complex primitives, to simulation-generated implicit surfaces created from many particles.
    [Get the paper here]
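    For reference, classical Sphere Tracing with a single global Lipschitz bound can be sketched as follows. This is a minimal baseline, not the paper's method: Segment Tracing instead bounds the Lipschitz constant locally over each marched segment, which permits larger steps, but that operator-specific bound computation is not reproduced here.

```python
import math

def sphere_trace(f, lipschitz, origin, direction, t_max=100.0, eps=1e-4):
    """Classical sphere tracing along the ray origin + t * direction.

    `f` is the field function and `lipschitz` a global Lipschitz bound on it,
    so |f(p)| / lipschitz is always a safe marching step.  Returns the hit
    distance (or None on a miss) and the number of field function queries.
    """
    t, queries = 0.0, 0
    while t < t_max:
        p = tuple(o + t * d for o, d in zip(origin, direction))
        v = f(p)
        queries += 1
        if abs(v) < eps:
            return t, queries        # close enough to the surface: hit
        t += abs(v) / lipschitz      # largest provably safe step
    return None, queries             # marched past t_max: miss

# Unit sphere as a signed distance field; its Lipschitz bound is 1.
sdf = lambda p: math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2) - 1.0
t_hit, n_queries = sphere_trace(sdf, 1.0, (0.0, 0.0, -3.0), (0.0, 0.0, 1.0))
```

    For a true distance field the global bound is already tight; the gains of Segment Tracing appear when `f` is a composite implicit model whose global bound is loose.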

  • 2019

    • Procedural Riverscapes

      A. Peytavie, T. Dupont, E. Guérin, Y. Cortial, B. Benes, J. Gain, E. Galin
      Pacific Graphics, 2019


      From a bare-earth input terrain, our method calculates the slope and drainage area to automatically generate a river graph that is procedurally amplified into a detailed river course. In this case, the generated river is 4.266 km long, with 63k terrain primitives and 42k flow primitives.

      Abstract: This paper addresses the problem of creating animated riverscapes through a novel procedural framework that generates the inscribing geometry of a river network and then synthesizes matching real-time water movement animation. Our approach takes bare-earth heightfields as input, derives hydrologically-inspired river network trajectories, carves riverbeds into the terrain, and then automatically generates a corresponding blend-flow tree for the water surface. Characteristics, such as the riverbed width, depth and shape, as well as elevation and flow of the fluid surface, are procedurally derived from the terrain and river type. The riverbed is inscribed by combining compactly supported elevation modifiers over the river course. Subsequently, the water surface is defined as a time-varying continuous function encoded as a blend-flow tree with leaves that are parameterized procedural flow primitives and internal nodes that are blend operators. While river generation is fully automated, we also incorporate intuitive interactive editing of both river trajectories and individual riverbed and flow primitives. The resulting framework enables the generation of a wide range of river forms, ranging from slow meandering rivers to rapids with churning water, including surface effects, such as foam and leaves carried downstream.
      [Get the paper here]
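      The blend-flow tree structure (leaves are parameterized flow primitives, internal nodes are blend operators) can be mimicked with closures. This is a hedged sketch: the `wave` primitive and its parameters are invented for illustration and are not the paper's actual flow primitives.

```python
import math

def wave(amplitude, wavelength, speed):
    """A toy flow primitive: a travelling sine wave h(x, t) on the surface."""
    return lambda x, t: amplitude * math.sin(2.0 * math.pi * (x - speed * t) / wavelength)

def blend(a, b, w):
    """An internal blend node: combine two child flows with weight w in [0, 1]."""
    return lambda x, t: (1.0 - w) * a(x, t) + w * b(x, t)

# A two-leaf tree: mostly calm water blended with fast, choppy flow.
calm = wave(amplitude=0.05, wavelength=4.0, speed=1.0)
choppy = wave(amplitude=0.20, wavelength=0.5, speed=3.0)
surface = blend(calm, choppy, w=0.25)
height = surface(1.0, 0.0)   # evaluate the time-varying surface at (x, t)
```

      Because the whole tree is just a composition of functions of (x, t), the water surface remains a continuous, time-varying function that can be evaluated anywhere without storing a mesh.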

    • Desertscape Simulation

      A. Paris, A. Peytavie, E. Guérin, O. Argudo, E. Galin
      Pacific Graphics, 2019


      Example of a desert landscape modeled with our simulation. The user defined a wind rose, added local swirls, and finally placed sand at the center of the scene. Our model automatically created a mega barchan and a star-shaped dune. The turbulence also created asymmetric transverse dunes as well as a linear dune.

      Abstract: We present an interactive aeolian simulation to author hot desert scenery. Wind is a major erosion agent in deserts which, despite its importance, has been neglected in computer graphics. Our framework overcomes this and allows generating a variety of sand dunes, including barchans, longitudinal and anchored dunes, and simulates abrasion which erodes bedrock and sculpts complex landforms. Given an input time-varying high-altitude wind field, we compute the wind field at the surface of the terrain according to the relief, and simulate the transport of sand blown by the wind. The user can interactively model complex desert landscapes, and control their evolution throughout time either by using a variety of interactive brushes or by prescribing events along a user-defined timeline.
      [Get the paper here]

    • Orometry-based Terrain Analysis and Synthesis

      O. Argudo, E. Galin, A. Peytavie, A. Paris, J. Gain, E. Guérin
      ACM Transactions on Graphics, Proceedings of SIGGRAPH Asia, 2019


      Given a regional terrain type and a rough elevation control map as input, our method automatically generates a synthetic graph of connected peaks and saddles, which is, in turn, used to procedurally generate a detailed heightfield obeying the orometric properties of the prescribed terrain type.

      Abstract: Mountainous digital terrains are an important element of many virtual environments and find application in games, film, simulation and training. Unfortunately, while existing synthesis methods produce locally plausible results they often fail to respect global structure. This is exacerbated by a dearth of automated metrics for assessing terrain properties at a macro level. We address these issues by building on techniques from orometry, a field that involves the measurement of mountains and other relief features. First, we construct a sparse metric computed on the peaks and saddles of a mountain range and show that, when used for classification, this is capable of robustly distinguishing between different mountain ranges. Second, we present a synthesis method that takes a coarse elevation map as input and builds a graph of peaks and saddles respecting a given orometric distribution. This is then expanded into a fully continuous elevation function by deriving a consistent river network and shaping the valley slopes. In terms of authoring, users provide various control maps and are also able to edit, reposition, insert and remove terrain features all while retaining the characteristics of a selected mountain range. The result is a terrain analysis and synthesis method that considers and incorporates orometric properties, and is, on the basis of our user study, more visually plausible than existing terrain generation methods.
      [Get the paper here]

    • Terrain Amplification with Implicit 3D Features

      A. Paris, E. Galin, A. Peytavie, E. Guérin, J. Gain
      ACM Transactions on Graphics, 38(5), 2019

      From a 2D input height field, our method automatically generates an implicit model for representing the terrain, which is augmented with complex 3D landform features such as caves, overhangs, cliffs, arches, or karsts. Our model can also represent dramatic and scenic science-fiction landscapes such as floating islands or giant rock spires.

      Abstract: While three-dimensional landforms, such as arches and overhangs, occupy a relatively small proportion of most computer generated landscapes, they are distinctive and dramatic and have an outsize visual impact. Unfortunately, the dominant heightfield representation of terrain precludes such features, and existing in-memory volumetric structures are too memory intensive to handle larger scenes. In this paper, we present a novel memory-optimized paradigm for representing and generating volumetric terrain based on implicit surfaces. We encode feature shapes and terrain geology using construction trees that arrange and combine implicit primitives. The landform primitives themselves are positioned using Poisson sampling, built using open shape grammars guided by stratified erosion and invasion percolation processes, and, finally, queried during polygonization. Users can also interactively author landforms using high-level modeling tools to create or edit the underlying construction trees, with support for iterative cycles of editing and simulation. We demonstrate that our framework is capable of importing existing large-scale heightfield terrains and amplifying them with such diverse structures as slot canyons, sea arches, stratified cliffs, fields of hoodoos, and complex karst cave networks.
      [Get the paper here]

    • Dendry: A Procedural Model for Dendritic Patterns

      M. Gaillard, B. Benes, E. Guérin, E. Galin, D. Rohmer, M.-P. Cani
      ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, 2019


      Dendry is a locally computable procedural function that generates branching patterns at various scales. One application is terrain synthesis.

      Abstract: We introduce Dendry, a procedural function that generates dendritic patterns and is locally computable. The function is controlled by parameters such as the level of branching, the degree of local smoothing, random seeding and local disturbance parameters, and the range of the branching angles. It is also controlled by a global control function that defines the overall shape and can be used, for example, to initialize local minima. The algorithm returns the distance to a tree structure which is implicitly constructed on the fly, while requiring a small memory footprint. The evaluation can be performed in parallel for multiple points and scales linearly with the number of cores. We demonstrate an application of our model to the generation of terrain heightfields with consistent river networks. A quad-core implementation of our algorithm takes about ten seconds for a 512x512 resolution grid on the CPU.
      [Get the paper here]

    • A Review of Digital Terrain Modeling

      E. Galin, E. Guérin, A. Peytavie, M.-P. Cani, G. Cordonnier, B. Benes, J. Gain
      Computer Graphics Forum, 38(2), Eurographics, 2019

      Example of hydraulic erosion applied to a fractal procedural terrain.

      Abstract: Terrains are a crucial component of three-dimensional scenes and are present in many Computer Graphics applications. Terrain modeling methods focus on capturing landforms in all their intricate detail, including eroded valleys arising from the interplay of varied phenomena, dendritic mountain ranges, and complex river networks. Set against this visual complexity is the need for user control over terrain features, without which designers are unable to adequately express their artistic intent. This article provides an overview of current terrain modeling and authoring techniques, organized according to three categories: procedural modeling, physically-based simulation of erosion and land formation processes, and example-based methods driven by scanned terrain data. We compare and contrast these techniques according to several criteria, specifically: the variety of achievable landforms; realism from both a perceptual and geomorphological perspective; issues of scale in terms of terrain extent and sampling precision; the different interaction metaphors and attendant forms of user-control, and computation and memory performance. We conclude with an in-depth discussion of possible research directions and outstanding technical and scientific challenges.
      [Get the paper here]
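      As a concrete taste of the simulation category surveyed here, a minimal thermal erosion step fits in a dozen lines. This is a textbook scheme chosen for brevity, not a method taken from the article, and the parameter values are arbitrary.

```python
def thermal_erosion_step(h, talus=0.01, k=0.5):
    """One thermal erosion step on a square heightfield (list of lists).

    Material moves from a cell to each lower 4-neighbour whenever the height
    difference exceeds the talus threshold; `k` scales how much of the excess
    is transported.  Reads from `h`, writes to a copy, so mass is conserved.
    """
    n = len(h)
    out = [row[:] for row in h]
    for i in range(n):
        for j in range(n):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < n and 0 <= nj < n:
                    diff = h[i][j] - h[ni][nj]
                    if diff > talus:
                        moved = k * (diff - talus) / 2.0
                        out[i][j] -= moved
                        out[ni][nj] += moved
    return out

# Example: a single spike spreads to its neighbours while mass is conserved.
spike = [[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]
eroded = thermal_erosion_step(spike)
```

      Iterating such steps relaxes slopes toward the talus angle; hydraulic erosion schemes like the one in the figure additionally route water and sediment over the same grid.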

    • Procedural Tectonic Planets

      Y. Cortial, A. Peytavie, E. Galin, E. Guérin
      Computer Graphics Forum, 38(2), Eurographics, 2019

      Our method creates planets (1) with continents, peninsulas, island arcs, oceanic ridges and trenches (4) by simulating plate tectonics. The detailed relief (5, 6) of the planet is generated by amplifying the coarse crust model (2, 3) with landforms.

      Abstract: We present a procedural method for authoring synthetic tectonic planets. Instead of relying on computationally demanding physically-based simulations, we capture the fundamental phenomena into a procedural method that faithfully reproduces large-scale planetary features generated by the movement and collision of the tectonic plates. We approximate complex phenomena such as plate subduction or collisions to deform the lithosphere, including the continental and oceanic crusts. The user can control the movement of the plates, which dynamically evolve and generate a variety of landforms such as continents, oceanic ridges, large-scale mountain ranges or island arcs. Finally, we amplify the large-scale planet model with either procedurally defined or real-world elevation data to synthesize coherent detailed reliefs. Our method allows the user to control the evolution of an entire planet interactively, and to trigger specific events such as catastrophic plate rifting.
      [Get the paper here]

    • Anisotropic Filtering for On-the-Fly Patch-based Texturing

      N. Lutz, B. Sauvage, F. Larue, J-M. Dischler
      Short paper, Eurographics, 2019

      Our filtering method (right) is compared to the ground truth (middle) and no filtering (left). The ground truth is computed by exact filtering of the high-resolution texture.

      Abstract: On-the-fly patch-based texturing consists of choosing at run-time, for several patches within a tileable texture, one random candidate among a pre-computed set of possible contents. This category of methods generates unbounded textures, for which filtering is not straightforward, because the screen pixel footprint may overlap multiple patches in texture space, i.e. different randomly chosen contents. In this paper, we propose a real-time anisotropic filtering method that is fully compliant with the standard graphics pipeline. The main idea is to pre-filter the contents independently, store them in an atlas, and combine them at run-time to produce the final pixel color. The patch map, which records the patch each fetched texel belongs to, requires a specific filtering approach in order to recover the patches that overlap at low resolutions. In addition, we show how this method can achieve blending at patch boundaries in order to further reduce visible seams, without modification of our filtering algorithm.


  • 2018

    • Procedural Cloudscapes

      A. Webanck, Y. Cortial, E. Guérin, E. Galin
      Computer Graphics Forum, 37(2), Eurographics, 2018

      Our procedural cloudscape model can represent different types of clouds.

      Abstract: We present a phenomenological approach for modeling and animating cloudscapes. We propose a compact procedural model for representing the different types of cloud over a range of altitudes. We define primitive-based field functions that allow the user to control and author the cloud cover over large distances easily. Our approach allows us to animate cloudscapes by morphing: instead of simulating the evolution of clouds using a physically-based simulation, we compute the movement of clouds using key-frame interpolation and tackle the morphing problem as an Optimal Transport problem. The trajectories of the cloud cover primitives are generated by solving an Anisotropic Shortest Path problem with a cost function that takes into account the elevation of the terrain and the parameters of the wind field.
      [Get the paper here]

    • Urban Weathering: Interactive Rendering of Polluted Cities

      I. Munoz-Pandiella, C. Bosch, S. Merillou, N. Merillou, G. Patow, X. Pueyo
      IEEE Transactions on Visualization and Computer Graphics, 2018

      Examples of polluted areas over a city. Views of buildings without pollution (left), with pollution (middle) and pollution layer from our simulation (right).

      Abstract: Weathering effects are ubiquitous phenomena in cities. Buildings age and deteriorate over time as they interact with the environment. Pollution accumulating on facades is a particularly visible consequence of this. Even though relevant work has been done to produce impressive images of virtual urban environments including weathering effects, so far no technique using a global approach has been proposed. Here, we propose a technique based on a fast physically-inspired approach that focuses on modeling the changes in appearance due to pollution soiling on an urban scale. We consider pollution effects to depend on three main factors: wind, rain and sun exposure, and we take into account three intervening steps: deposition, reaction and washing. Using a low-cost pre-computation, we evaluate the pollution distribution throughout the city. Based on this and the use of screen-space operators, our method results in an efficient approach able to generate realistic images of urban scenes by combining the intervening factors at interactive rates. In addition, the pre-computation demands a reduced amount of memory to store the resulting pollution map and, as it is independent of scene complexity, it can suit large and complex models by adapting the map resolution.


  • 2017

    • Real-Time Solar Exposure Simulation in Complex Cities

      I. Muñoz-Pandiella, C. Bosch, N. Mérillou, X. Pueyo, S. Mérillou
      Computer Graphics Forum, 36(8), 554-566, 2017

      Overview of our system. A directional solar exposure map is pre-computed over the specified time period and under the given conditions (bottom row). At runtime, we evaluate visibility in screen space by combining a global and local view of the scene (top row). Surface solar exposure is finally computed by sampling the map within the sky regions visible from each point (right).

      Abstract: In urban design, estimating solar exposure on complex city models is crucial, but existing solutions typically focus on simplified building models and are too demanding in terms of memory and computational time. In this paper, we propose an interactive technique that estimates solar exposure on detailed urban scenes. Given a directional exposure map computed over a given time period, we estimate the sky visibility factor that serves to evaluate the final exposure at each visible point. This is done using a screen-space method based on a two-scale approach, which is geometry independent and has low storage costs. Our method performs at interactive rates and is designer-oriented. The proposed technique is relevant in architecture and sustainable building design as it provides tools to estimate the energy performance of buildings as well as weathering effects in urban environments.
      [Get the paper here]
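      The sky visibility factor at the core of the exposure estimate can be illustrated with a Monte Carlo hemisphere sampler. This is an illustrative sketch only: the paper evaluates visibility in screen space with a two-scale approach rather than by direction sampling, and the occluder below is hypothetical.

```python
import math
import random

def sky_visibility(occluded, samples=1024, seed=1):
    """Estimate the fraction of the sky hemisphere visible from a point.

    `occluded(azimuth, elevation)` returns True when geometry blocks that
    direction.  Directions are drawn uniformly over the hemisphere's solid
    angle (elevation = asin(u) for u uniform in [0, 1)).
    """
    rng = random.Random(seed)
    visible = 0
    for _ in range(samples):
        azimuth = rng.uniform(0.0, 2.0 * math.pi)
        elevation = math.asin(rng.random())
        if not occluded(azimuth, elevation):
            visible += 1
    return visible / samples

# Toy occluder: surroundings block every direction below 30 degrees of
# elevation, so the expected visibility factor is 1 - sin(30 deg) = 0.5.
v = sky_visibility(lambda az, el: el < math.radians(30.0))
```

      The final surface exposure then follows by sampling the pre-computed directional exposure map only within the sky regions found visible, as the figure caption above describes.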

    • Interactive Example-Based Terrain Authoring with Conditional Generative Adversarial Networks

      E. Guérin, J. Digne, E. Galin, A. Peytavie, C. Wolf, B. Benes, B. Martinez
      ACM Transactions on Graphics, Proceedings of SIGGRAPH Asia, 2017

      Our interactive terrain modeling framework allows the user to quickly, easily, and intuitively author realistic terrain models by using sketches of crest lines, rivers, or iso contours. Our method consists of a training step and an interactive sketching step. During the training step, we analyze a large number of terrains and extract geometric features that will serve as sketching elements. The sketches are fed to a set of cGANs which learn various generative processes. During the authoring step, the pre-trained networks are used to synthesize a terrain model that matches the input sketch and the incremental edits.

      Abstract: Authoring virtual terrains presents a challenge and there is a strong need for authoring tools able to create realistic terrains with simple user inputs and with high user control. We propose an example-based authoring pipeline that uses a set of terrain synthesizers dedicated to specific tasks. Each terrain synthesizer is a Conditional Generative Adversarial Network trained by using real-world terrains and their sketched counterparts. The training sets are built automatically so that the terrain synthesizers learn the generation from features that are easy to sketch. During the authoring process, the artist first creates a rough sketch of the main terrain features, such as rivers, valleys and ridges, and the algorithm automatically synthesizes a terrain corresponding to the sketch using the learned features of the training samples. Moreover, an erosion synthesizer can also generate terrain evolution by erosion at a very low computational cost. Our framework allows for easy terrain authoring and provides a high level of realism for a minimum sketch cost. We show various examples of terrain synthesis created by experienced as well as inexperienced users who are able to design a vast variety of complex terrains in a very short time.
      [Get the paper here]

    • Bi-Layer textures: a Model for Synthesis and Deformation of Composite Textures

      G. Guingo, B. Sauvage, J-M. Dischler, M-P. Cani
      Eurographics Symposium on Rendering, 2017


      Our noise model decomposes an input exemplar as a structure layer and a noise layer. The noise layer captures a spatially varying Gaussian noise as a blend of stationary noises. Large outputs are synthesized on-the-fly by synchronized synthesis of the layers. Variety can be achieved at the synthesis stage by deforming the structure layer while preserving fine scale appearance, encoded in the noise layer.

      Abstract: We propose a bi-layer representation for textures which is suitable for on-the-fly synthesis of unbounded textures from an input exemplar. The goal is to improve the variety of outputs while preserving plausible small-scale details. The insight is that many natural textures can be decomposed into a series of fine scale Gaussian patterns which have to be faithfully reproduced, and some non-homogeneous, larger scale structure which can be deformed to add variety. Our key contribution is a novel, bi-layer representation for such textures. It includes a model for spatially-varying Gaussian noise, together with a mechanism enabling synchronization with a structure layer. We propose an automatic method to instantiate our bi-layer model from an input exemplar. At the synthesis stage, the two layers are generated independently, synchronized and added, preserving the consistency of details even when the structure layer has been deformed to increase variety. We show on a variety of complex, real textures, that our method reduces repetition artifacts while preserving a coherent appearance.
      [Get the paper here]

    • Authoring Landscapes by Combining Ecosystem and Terrain Erosion Simulation

      G. Cordonnier, E. Galin, J. Gain, B. Benes, E. Guérin, A. Peytavie, M.P. Cani
      SIGGRAPH, 2017

      Our framework combines layered terrain and vegetation data and supports their interlinked simulation, which can be driven by users editing layers or triggering natural events. (1) The user first provides a bare-earth digital elevation map for time step t0 and our framework simulates interleaved erosion and plant growth, up to t0+215 years. (2) In the next time step at t0+210 years, a landslide creates boulders that destroy vegetation. One year later, the designer triggers a fire in the valley, which spreads to consume part of the forest. (3) After four more years at t0+215, the remaining trees have continued growing, new saplings have germinated and the humus layer is beginning to regenerate. The white loops indicate affected areas.

      Abstract: We introduce a novel framework for interactive landscape authoring that supports bi-directional feedback between erosion and vegetation simulation. Vegetation and terrain erosion have strong mutual impact and their interplay influences the overall realism of virtual scenes. Despite their importance, these complex interactions have been neglected in computer graphics. Our framework overcomes this by simulating the effect of a variety of geomorphological agents and the mutual interaction between different material and vegetation layers, including rock, sand, humus, grass, shrubs, and trees. Users are able to exploit these interactions with an authoring interface that consistently shapes the terrain and populates it with details. Our method, validated through side-by-side comparison with real terrains, can be used not only to generate realistic static landscapes, but also to follow the temporal evolution of a landscape over a few centuries.
      [Get the paper here]

    • Coherent Multi-Layer Landscape Synthesis

      O. Argudo, C. Andujar, A. Chica, E. Guérin, J. Digne, A. Peytavie, E. Galin
      The Visual Computer, 2017

      Overview of our coherent multi-layer landscape synthesis: given a set of input exemplars, our method automatically creates a high-resolution consistent and coherent multi-layer terrain model from a low-resolution elevation model by matching input patches with the nearest dictionary atoms.

      Abstract: We present an efficient method for generating coherent multi-layer landscapes. We use a dictionary built from exemplars to synthesize high-resolution fully featured terrains from input low-resolution elevation data. Our example-based method consists of analyzing real-world terrain examples and learning the procedural rules directly from these inputs. We take into account not only the elevation of the terrain, but also additional layers such as the slope, orientation, drainage area, the density and distribution of vegetation, and the soil type. By increasing the variety of terrain exemplars, our method allows the user to synthesize and control different types of landscapes and biomes, such as temperate or rain forests, arid deserts and mountains.
      [Get the paper here]