Quarantine, 2020

Embroidery is a procedural system for creating needlework patterns from 3D geometry.

Process

The embroidery stitching was the least challenging part of this project; the surprisingly complicated steps come early on, preparing the 3D models so they can take advantage of the stitching setup. The prep breaks down into five main stages:

  1. Burn lighting information into the model
  2. Slice the model with a bunch of parallel cutting planes
  3. Flatten the slices with occlusion culling
  4. Add silhouette outlines
  5. Add occlusion contours


1. Light the model

The stitch lines do a great job of showing the shape of the original model, but adding some lighting information is a nice way to incorporate different thread colors that simulate the lighting setup. This is all done in a point wrangle: I pass in a position for the light source and use a dot product to get the lighting contribution per point (ignoring shadows).
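
In VEX, that wrangle can be sketched roughly like this. It assumes point normals (e.g. from a Normal SOP); the light_pos channel and the two preview colors are placeholder names, not part of the original setup.

    // Point wrangle: bake a simple Lambert term per point (shadows ignored).
    // "light_pos" is a placeholder channel for the light source position.
    vector light_pos = chv("light_pos");

    vector L   = normalize(light_pos - @P);            // point-to-light direction
    float  lit = clamp(dot(normalize(@N), L), 0, 1);   // lighting contribution, 0..1

    // Store it for later stages, and preview it as a blend of two thread colors.
    f@lighting = lit;
    @Cd = lerp({0.1, 0.1, 0.2}, {1.0, 0.95, 0.8}, lit);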

2. Slice the model

This bit of brute-force booleaning is the most time-consuming part of the process. Since the final output before the stitching needs to be splines (polylines), the simplest way I have found to get interesting results is to cut the model with several hundred parallel cutting planes, with the Houdini Boolean SOP set to "Seam" mode, which produces a spline from each boolean calculation.
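
One way to build the stack of cutters is to generate a template point per plane and copy a large grid onto those points before the Boolean. The sketch below is just that template step, assuming an empty detail wrangle with the model wired to its second input; the planes channel is a placeholder name, and the Seam mode itself is set on the Boolean SOP afterwards.

    // Detail wrangle, model on the second input (index 1): emit one template
    // point per cutting plane, evenly spaced along Y across the model's bounds.
    // Copy a large grid onto these points, then Boolean model vs. grids in Seam mode.
    int planes = chi("planes");                     // e.g. several hundred

    vector bbmin, bbmax;
    getbbox(1, bbmin, bbmax);

    for (int i = 0; i < planes; i++)
    {
        float t = (i + 0.5) / float(planes);        // even spacing in 0..1
        float y = lerp(bbmin.y, bbmax.y, t);
        addpoint(0, set(0.0, y, 0.0));
    }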

3. Flatten the model with occlusion culling

The previous step is great, but being able to see through the lines to the back side of the model makes the visuals very confusing, so the background lines need to be eliminated. The easiest fix would be to cull backfaces, but that doesn't go far enough. The trick is to cast a ray from each point on the lines towards the camera. If the ray collides with the original model, the point that cast it gets deleted. After all the rays are checked, the remaining points are flattened onto a plane perpendicular to the camera.
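
A minimal sketch of that culling wrangle, run over the slice curves with the original model wired to a second input; cam_pos is a placeholder channel, and the flattening assumes a camera looking down the Z axis.

    // Point wrangle over the slice curves, original model on the second input.
    // Cast a ray from each point toward the camera; if it hits the model, the
    // point is on the far side, so remove it. Survivors get flattened.
    vector cam_pos = chv("cam_pos");
    vector dir = cam_pos - @P;

    // Nudge the ray origin toward the camera so it doesn't immediately hit
    // the surface the point is sitting on.
    vector origin = @P + normalize(dir) * 0.001;

    vector hit_pos, hit_uvw;
    if (intersect(1, origin, dir, hit_pos, hit_uvw) >= 0)
        removepoint(0, @ptnum);
    else
        @P.z = 0;   // flatten; assumes the camera looks down the Z axis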

4. Extract the silhouette

This step would be a bit more challenging if not for the fantastic SideFX Labs SOP called, you guessed it, Extract Silhouette. This SOP will return the outlines of a 3D model projected along the x, y, or z axis.

(Edit: The newest update of the Extract Silhouette SOP allows setting an arbitrary projection axis and also incorporates the next step in my process, finding the occlusion contours.)

5. Add occlusion contours

The silhouette is a great addition, but sometimes the structure of the flattened 3D model benefits from extra lines that make the foreground stand out more noticeably against the background. These additional lines are called occlusion contours, and there are several methods that can be used to create them.

I went with a brute-force, 3D Sobel-filter-style version. I start by scattering 500,000 points (an arbitrary number arrived at by trial and error) on the original 3D model. Each point is assigned a brightness based on its distance to the camera. I cull and flatten the points in the same manner used for the splines so they all end up on the same plane. I then average the brightness of a selection of neighboring points, and if this average is too similar to the original point's brightness, that point gets deleted. I connect the remaining points to each other, convert this spline geometry to a VDB, and immediately convert it back to a mesh. Finally, I use the Straight Skeleton 3D SOP (also part of SideFX Labs) to recreate the original splines, which gives me more predictable geometry to use in the stitching step.
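
The delete-if-too-flat comparison could look roughly like this as a point wrangle over the flattened scatter points; the radius, maxpts, and threshold channels and the brightness attribute name are placeholders for whatever the actual setup uses.

    // Point wrangle over the scattered, culled, and flattened points.
    // Compare each point's camera-distance "brightness" to the average of its
    // neighbors; if the difference is small, the point isn't on a depth edge.
    float radius    = chf("radius");
    int   maxpts    = chi("maxpts");
    float threshold = chf("threshold");

    int   pts[] = nearpoints(0, @P, radius, maxpts);
    float sum   = 0;
    int   count = 0;

    foreach (int pt; pts)
    {
        if (pt == @ptnum) continue;             // skip the point itself
        float b = point(0, "brightness", pt);
        sum += b;
        count++;
    }

    if (count > 0 && abs(f@brightness - sum / count) < threshold)
        removepoint(0, @ptnum);                 // flat region, not an occlusion contour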

Time for the stitching

Now that I have a flat model made up of lines with color information, I can start the embroidery. This step is pretty much as you imagine it: I walk each line and break it up into stitches of varying length, which produces a generic dotted line. I wanted a bit more variety, so I created a few different stitch options. For the final images, I used only four of the stitch patterns:

  1. Running stitch (dotted or solid line)
  2. Parallel stitch (multiple parallel running stitches)
  3. Cross stitch (creates an X shape with a slight vertical offset so the second stitch looks like it overlaps the first)
  4. Stem stitch (looks like an alternating diagonal running stitch)
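
The basic running stitch can be sketched as a primitive wrangle like the one below; stitch_len and gap_len are placeholder channels, and it assumes the curves have been resampled to fairly even segment lengths, since the parametric sampling is only roughly proportional to arc length.

    // Primitive wrangle: chop each polyline into a running stitch, i.e. short
    // segments with gaps between them. "stitch_len" and "gap_len" are
    // placeholder channels.
    float stitch_len = chf("stitch_len");
    float gap_len    = chf("gap_len");

    float len  = primintrinsic(0, "measuredperimeter", @primnum);
    float step = max(stitch_len + gap_len, 1e-4);   // guard against a zero step

    for (float d = 0; d + stitch_len <= len; d += step)
    {
        float u0 = d / len;
        float u1 = (d + stitch_len) / len;

        vector p0 = primuv(0, "P", @primnum, set(u0, 0, 0));
        vector p1 = primuv(0, "P", @primnum, set(u1, 0, 0));

        int a = addpoint(0, p0);
        int b = addpoint(0, p1);
        addprim(0, "polyline", a, b);

        // Carry the thread color along; other attributes can be sampled the same way.
        vector c0 = primuv(0, "Cd", @primnum, set(u0, 0, 0));
        vector c1 = primuv(0, "Cd", @primnum, set(u1, 0, 0));
        setpointattrib(0, "Cd", a, c0);
        setpointattrib(0, "Cd", b, c1);
    }

    // Drop the source curve so only the stitches remain.
    removeprim(0, @primnum, 1);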

Early R&D

I went through a few different ideas before settling on classic sculptures. Earlier explorations involved running a growth algorithm over text and graphic shapes.

Final notes

For the closer renders, I used some smarts from the fine duo at Entagma. They covered embroidery in one of their tutorials and showed how to turn a single thread into a winding, multi-thread spiral. I used this process for the detail shots, where seeing the threads up close benefits from the extra detail.
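
This isn't Entagma's exact setup, but the core idea can be sketched as a point wrangle over a finely resampled thread centerline, run once per strand. It assumes a tangentu attribute from a PolyFrame SOP and a curveu attribute from a Resample SOP; thread_radius, twists, and strand_offset are placeholder channels.

    // Point wrangle over a finely resampled thread centerline: push each point
    // sideways in a rotating pattern so the copy winds around the original
    // curve like plied thread. Run once per strand with a different offset.
    float radius = chf("thread_radius");
    float twists = chf("twists");            // revolutions along the full curve
    float phase  = chf("strand_offset");     // e.g. 0, 1/3, 2/3 for three strands

    // Build a simple frame around the curve tangent (a world-up reference is
    // fine for a sketch, but degenerates if the tangent points straight up).
    vector up   = {0, 1, 0};
    vector side = normalize(cross(v@tangentu, up));
    vector lift = normalize(cross(v@tangentu, side));

    float angle = (f@curveu * twists + phase) * 2 * PI;
    @P += (side * cos(angle) + lift * sin(angle)) * radius;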

I wanted to have a little more visual interest, so I decided against showing the work on completely flat cloth. I created a series of simple cloth-drop Vellum projects and saved out the more interesting versions. After the embroidery model gets flattened to a plane, I transfer that plane to the cloth, using the UVs on the cloth as rest positions for the embroidery. This also means that the setup can be easily animated, both by animating the stitches and by having the stitches appear on animated cloth.
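
One way to do that transfer is a point wrangle over the flattened embroidery with two extra inputs: the cloth laid out flat by its UVs (the rest layout) and the same cloth deformed by the Vellum sim. This is a minimal sketch of that idea, assuming the embroidery plane already lines up with the cloth's UV layout.

    // Point wrangle over the flattened embroidery curves.
    // Input 1: the cloth laid out flat by its UVs (rest layout).
    // Input 2: the same cloth, deformed/animated, with identical topology.
    vector uvw;
    int    prim;
    xyzdist(1, @P, prim, uvw);        // where does this stitch point sit on the rest cloth?

    // Re-evaluate that primitive location on the deformed cloth and move there.
    @P = primuv(2, "P", prim, uvw);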