Parks x Needle
by Sally Kong and Owen Trueblood, 2022
At the end of 2021, when travel restrictions due to COVID were still in place, I was away from family and all my friends were gone for the holidays, so I turned to nature.
During that time, I visited 12 parks in New York City that have designated wildlife zones. Awestruck by the colors of fallen ginkgo leaves, evergreens, terminal buds, goose manure, and thistles in the midst of the muted winter landscape, I decided to build a robotic tufting system and create tufted maps of these places imbued with colors from my experience.
There were four major steps in producing these textile park visualizations:
1. Hacking the tufting gun so that it could be controlled programmatically
2. Generating tool paths for the tufting gun to tuft the shape of a park from each borough of New York City
3. Controlling a UR5 robot arm and the tufting gun to stitch the shapes of the wildlife zones
4. Park-inspired punch-needling done by hand to fill in the boundaries tufted by the robot
Hacking the Tufting Gun
Tufting is a textile manufacturing process in which a needle shoves yarn into fabric and the fabric’s backing holds the yarn in place. It can be done by hand with a special large hollow needle called a punch needle, or with a machine called a tufting gun.
We first prototyped by mounting a punch needle on the UR5 robot arm, but decided to work with a tufting gun instead for speed. We wanted a way to control the tufting gun either from the robot controller or from a computer. To do so, Owen replaced the motor controller, added an Arduino, exposed the Arduino’s USB connection, and added a standard 3.5mm stereo audio jack as the IO port. The USB connection was used to update the firmware, the robot’s digital output was used to turn the tufting gun on or off, and its analog output was used to set the tufting gun’s speed.
Generating Tool Paths
For the final pieces, I wanted the robot to draw the shapes of the parks, which I later filled in with colors I saw at the parks. To do so, we used the NYC Parks Forever Wild dataset from Open Data NY, which maps the ecologically important wildlife zones across 138 parks in NYC. For parsing the data and generating the tool paths, I used software called Houdini. I felt comfortable with Houdini because I’ve worked in the animation industry for the past five years, where it is heavily used for creating procedural models, crowd simulations, and special effects.
Houdini provides a node-based programming environment for manipulating geometry, and its off-the-shelf nodes allow for common operations like moving geometry around, simulating particles, adding noise, etc. But there are also nodes that allow arbitrary programs to be written to create any kind of operation you might want. By combining some common geometry nodes and custom scripts, I made a Houdini node network that did the following:
- Create a canvas polygon out of horizontal rows to represent the 16’ x 20’ monk’s cloth to be tufted on.
- Create Houdini polygon primitives from the NYC Parks Forever Wild GeoJSON data using vvzen’s Houdini-Geospatial-Tools.
- Transform the park shape primitive to fit in the canvas.
- Extrude the park shape to find intersections with the canvas polygon.
- Group the points on the canvas that intersect with the park shape.
- Assign attributes to each point that describe the direction of the tool tip. I wanted the robot to move row by row from left to right, so I assigned the direction values while moving along the rows of the canvas polygon.
- Create lines based on the point attributes’ state changes. These correspond to the lines that will be tufted by the robot. The robot motion planner takes in a series of points and does the inverse kinematics for us, so we only needed the endpoints of these lines.
- Export the endpoints, along with a command indicating whether the tufting gun should go down to tuft or come up to stop, as a comma-separated values (CSV) file.
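The row-by-row logic above can be sketched outside Houdini as a classic even-odd scanline: find where each horizontal row crosses the park polygon, then pair up the crossings into segments to tuft. This is a minimal plain-Python sketch, not the actual node network; the CSV column layout and DOWN/UP command names are illustrative assumptions.

```python
import csv
import io

def scanline_segments(polygon, rows):
    """For each horizontal row (a y value), find the spans that lie inside
    the polygon using the even-odd rule: these are the lines to tuft."""
    segments = []
    n = len(polygon)
    for y in rows:
        xs = []
        for i in range(n):
            (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
            # Does this polygon edge cross the row's horizontal line?
            # (Horizontal edges are skipped since y1 == y2 fails both tests.)
            if (y1 <= y < y2) or (y2 <= y < y1):
                xs.append(x1 + (y - y1) * (x2 - x1) / (y2 - y1))
        xs.sort()
        # The interior alternates at each crossing, so pair them up.
        for left, right in zip(xs[::2], xs[1::2]):
            segments.append((left, y, right, y))
    return segments

def to_csv(segments):
    """One CSV row per endpoint, with a DOWN/UP command for the gun
    (assumed x,y,command layout for illustration)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for x1, y, x2, _ in segments:
        writer.writerow([x1, y, "DOWN"])  # lower the gun: start tufting
        writer.writerow([x2, y, "UP"])    # raise the gun: stop tufting
    return buf.getvalue()
```

For a 10 x 10 square polygon with rows at y = 2 and y = 5, `scanline_segments` returns two full-width segments, `[(0.0, 2, 10.0, 2), (0.0, 5, 10.0, 5)]`, which `to_csv` turns into four DOWN/UP endpoint rows.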
Controlling the Machines
With the CSV of generated tool paths in hand, the next step was to use it to control the robot arm and the tufting gun. For this, we wrote a Python script that parses the CSV and sends signals to the robot arm to move to a position or to power the tufting gun. To communicate with the robot, we used Owen’s Python wrapper around RoboDK and robolink. RoboDK is a simulator for industrial robot programming, and robolink is a Python module that interfaces with RoboDK.
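As a rough illustration of what such a script does, here is a minimal sketch of the CSV-parsing half. Each endpoint row becomes a linear move plus a gun on/off command; in the real script, each MOVEL would become a robolink move call and each SET_DO a digital-output write to the robot. The x,y,command column layout, the DOWN/UP names, and the z-height parameter are assumptions for illustration, not the project’s actual file format.

```python
import csv
import io

def parse_toolpath(csv_text, z_surface=0.0):
    """Turn an exported toolpath CSV into a flat command list.

    Each CSV row holds an endpoint (x, y) and a DOWN/UP command telling
    the tufting gun to start or stop. The robot moves to the point first,
    then the gun state is switched.
    """
    commands = []
    for x, y, cmd in csv.reader(io.StringIO(csv_text)):
        commands.append(("MOVEL", float(x), float(y), z_surface))
        commands.append(("SET_DO", 1 if cmd == "DOWN" else 0))
    return commands
```

For example, `parse_toolpath("0,2,DOWN\n10,2,UP\n")` yields `[("MOVEL", 0.0, 2.0, 0.0), ("SET_DO", 1), ("MOVEL", 10.0, 2.0, 0.0), ("SET_DO", 0)]`: move to the left endpoint, lower the gun, tuft across to the right endpoint, raise the gun.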
Park-inspired Punch-needling
In the spirit of the exhibit’s theme, “Ground Truth IRL”, I visited parks in all five boroughs of New York City to experience and understand what lies within these boundaries of ecologically important wildlife zones. This was also during the holiday season of 2021, when my friends had gone to their respective homes while I couldn’t go back home to Korea due to COVID travel restrictions. With no friends or family around, I turned to nature.
At first glance, the parks in winter seemed so bare, gray, and cold because the trees had shed all their foliage. But with more time and attention, other parts of the forests started to come into sight. There were vibrant pink thorns and vines in the midst of dead leaves, golden ochre terminal buds at the ends of branches gearing up for new growth, and so much green manure from Canada geese.
I could have made a geographically accurate map of these sights or a statistically accurate color palette. But for the final pieces that I punch-needled by hand, I chose to focus on and highlight the sights that struck me most during my personal park excursions. While the public data gave the pieces their shape and context, the color and feeling were dictated by my personal experience.
Parks x Needle was part of Data Through Design’s 2022 exhibit, Ground Truth IRL. There were 11 other projects presented in this exhibit, including woven bar charts dyed with food scraps that visualize food insecurity in NY, an installation examining discount stores in NY, and an experimental video exploring the unseen traces of children in the OpenCity data.
For the exhibit, we presented five pieces, each representing a park from one of New York’s boroughs, and we also brought the robotic tufting system for a live demo, which allowed the audience to watch, ask questions, and learn about the production process.