
Procedural Cities using WaveFunctionCollapse

About
High Concept
  • Individual Thesis Project in Personal Engine and Unreal Engine 4

  • ~6 Month Development Time

A technique for procedurally generating city layouts from only a single example image, rather than requiring large amounts of input data or a user-authored primary layout.

Notable Features
  • WaveFunctionCollapse

  • Lot Subdivision

  • Building Placement

WaveFunctionCollapse

WaveFunctionCollapse (WFC) is an example-based image generation algorithm created by Maxim Gumin. It operates by extracting every unique pattern from an example image and initializing every pixel of the output image as a superposition of all of those patterns. As the algorithm runs, possible patterns are removed from pixels until every pixel of the output has only one pattern remaining. An image is then generated from those patterns.

There are three main steps in WFC:

  1. Pattern Generation

  2. The Main Loop which consists of:

    1. Observation

    2. Propagation

  3. Image Output

Pattern Generation: Pattern generation extracts every NxN pixel pattern from the example image, where N is the pattern size in pixels, a parameter set by the user at runtime. If a pattern occurs multiple times in the example image, its weight is increased so that it is more likely to occur in the output image.
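As a rough illustration of this step, here is a minimal Python sketch (not the project's engine code; the grid-of-pixel-values representation and the `extract_patterns` name are assumptions) that collects every NxN window from an example grid and counts repeats as weights:

```python
from collections import Counter

def extract_patterns(image, n):
    """Collect every NxN pattern from `image` (a 2D list of pixel values).

    Each repeat of a pattern increases its weight, making common patterns
    more likely to be chosen for the output image.
    """
    height, width = len(image), len(image[0])
    weights = Counter()
    for y in range(height - n + 1):
        for x in range(width - n + 1):
            # Store the NxN window as a hashable tuple of rows.
            pattern = tuple(tuple(image[y + dy][x + dx] for dx in range(n))
                            for dy in range(n))
            weights[pattern] += 1
    return weights

example = [[0, 0, 0],
           [0, 0, 0],
           [1, 1, 1]]
weights = extract_patterns(example, 2)  # two unique 2x2 patterns, each seen twice
```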

Observation: Observation picks the output pixel closest to having only one possible pattern remaining and uses a weighted random selection to choose one of its patterns, setting every other pattern to false for that pixel. This introduces a change whose effects are then spread to neighboring pixels in the propagation step.
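A minimal sketch of observation, assuming the output is stored as a 2D grid of candidate-pattern sets (the `wave` name and layout are illustrative, not the project's actual data structures):

```python
import random

def observe(wave, weights):
    """Collapse the uncollapsed cell with the fewest remaining patterns.

    `wave` is a 2D list of sets of pattern indices; `weights` maps a
    pattern index to how often it appeared in the example image.
    """
    # Pick the cell closest to having only one possible pattern left.
    y, x = min(((cy, cx) for cy, row in enumerate(wave)
                for cx, cell in enumerate(row) if len(cell) > 1),
               key=lambda pos: len(wave[pos[0]][pos[1]]))
    candidates = sorted(wave[y][x])
    # Weighted random pick; every other pattern is removed from the cell.
    chosen = random.choices(candidates, [weights[c] for c in candidates])[0]
    wave[y][x] = {chosen}
    return y, x

wave = [[{0, 1, 2}, {0, 1}],
        [{0}, {0, 1, 2}]]
y, x = observe(wave, {0: 3, 1: 1, 2: 1})  # collapses the 2-candidate cell
```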

Propagation: This step spreads the effects of changes throughout the output. Whenever a pixel changes, it is added to a queue so that the effects of its change can be applied to its neighbors: any pattern in a neighboring pixel that no longer overlaps correctly with at least one remaining pattern of the changed pixel is set to false. If a neighbor loses patterns this way, it is added to the queue as well. This acts as a flood fill that can affect the entire output.
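The queue-based propagation described above might look like this in Python (an illustrative sketch under the same assumed `wave` representation; `compatible` checks that two patterns agree on their overlapping pixels):

```python
from collections import deque

def compatible(p, q, dy, dx, n):
    """True if pattern q, offset by (dy, dx) from p, agrees on the overlap."""
    for y in range(n):
        for x in range(n):
            qy, qx = y - dy, x - dx
            if 0 <= qy < n and 0 <= qx < n and p[y][x] != q[qy][qx]:
                return False
    return True

def propagate(wave, patterns, n, start):
    """Spread the effects of a changed cell through the wave via a queue."""
    queue = deque([start])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if not (0 <= ny < len(wave) and 0 <= nx < len(wave[0])):
                continue
            # Keep only neighbor patterns that still overlap correctly with
            # at least one pattern remaining in the changed cell.
            allowed = {q for q in wave[ny][nx]
                       if any(compatible(patterns[p], patterns[q], dy, dx, n)
                              for p in wave[y][x])}
            if allowed != wave[ny][nx]:
                wave[ny][nx] = allowed
                queue.append((ny, nx))  # this neighbor changed; keep spreading

patterns = [((0, 0), (0, 0)),   # solid 0
            ((1, 1), (1, 1)),   # solid 1
            ((0, 1), (0, 1))]   # 0 on the left, 1 on the right
wave = [[{0}, {0, 1, 2}]]
propagate(wave, patterns, 2, (0, 0))  # solid-1 can no longer sit to the right
```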

Image Output: This step simply creates an image from the final pattern determined for each pixel in the steps above. Images are output as PNGs.

Lot Subdivision

After an image is generated using WaveFunctionCollapse, it is loaded into Unreal Engine 4 as a texture. The pixels from this texture are used to place three different types of nodes in the scene: highway, road, and building nodes. The building nodes are then grouped into lots using a flood fill. If the area of a lot is greater than the desired lot size, a parameter set by the user, the lot is split roughly in half: the average position of the lot's nodes is calculated, a line is conceptually drawn from that position to the nearest road or highway node, and the lot is split into two lots along this line. This repeats until every lot is within the desired size.
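The flood fill that groups building nodes into lots could be sketched like this (a simplified Python illustration over a character grid; the actual project works on scene nodes in UE4, and the `group_into_lots` name is an assumption):

```python
from collections import deque

def group_into_lots(grid):
    """Group 4-connected building cells ('B') into lots with a flood fill.

    `grid` is a list of strings; any non-'B' cell (road, highway) separates
    lots. Returns a list of lots, each a list of (y, x) coordinates.
    """
    h, w = len(grid), len(grid[0])
    seen = set()
    lots = []
    for y in range(h):
        for x in range(w):
            if grid[y][x] != 'B' or (y, x) in seen:
                continue
            lot, queue = [], deque([(y, x)])
            seen.add((y, x))
            while queue:
                cy, cx = queue.popleft()
                lot.append((cy, cx))
                for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                               (cy, cx - 1), (cy, cx + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and grid[ny][nx] == 'B' and (ny, nx) not in seen):
                        seen.add((ny, nx))
                        queue.append((ny, nx))
            lots.append(lot)
    return lots

grid = ["BB.B",
        "BB.R",
        "R..B"]
lots = group_into_lots(grid)  # one 4-cell lot and two single-cell lots
```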

Building Placement

Once a lot is within the desired size, a building is placed in that lot. The size and location of the building are determined by zoning data. Specifically, the possible range of heights, how much of the lot to use, and how far the building is offset from the road are all specified by the user for each of the three zones: residential, industrial, and commercial. Buildings start centered at the average position of the lot and orient themselves toward the nearest road node. The number of building nodes in front of, to the left of, to the right of, and behind the building is determined using dot products, and the size of the building in each direction is roughly set using those counts. If any corner or edge midpoint of the building lies outside the lot, it is pulled in until each of those positions is within the lot.
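The dot-product binning of lot nodes into front/back/left/right could be sketched as follows (illustrative Python; the tie-breaking rule and the `count_direction_nodes` name are assumptions, not the project's exact logic):

```python
def count_direction_nodes(nodes, center, facing):
    """Count lot nodes in front of, behind, left of, and right of a building.

    `facing` is a unit vector from the building toward the nearest road node.
    Each node is binned by its dot products with the facing axis and a
    perpendicular side axis; these counts roughly set the building's extents.
    """
    fx, fy = facing
    rx, ry = fy, -fx  # perpendicular side axis
    counts = {"front": 0, "back": 0, "left": 0, "right": 0}
    for x, y in nodes:
        dx, dy = x - center[0], y - center[1]
        along = dx * fx + dy * fy    # projection onto the facing direction
        across = dx * rx + dy * ry   # projection onto the side axis
        if abs(along) >= abs(across):
            counts["front" if along >= 0 else "back"] += 1
        else:
            counts["right" if across >= 0 else "left"] += 1
    return counts

nodes = [(2, 0), (-1, 0), (0, 3), (0, -2)]
counts = count_direction_nodes(nodes, (0, 0), (1, 0))  # one node per direction
```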
