Building an interactive plotter art installation

2/4/2024

I was given the opportunity to participate in SIGGRAPH 2023 in Los Angeles, and I decided to showcase how art, code, and pen plotters mesh together. To do this, I created an interactive art installation that brought all three together.

Building an interactive art installation requires careful planning and application architecture. In this post, I will explain my decisions and how I arrived at them.

My design goal was simple: A person can walk up, play with a MIDI controller, see a resulting image on a screen, and then send the image to a pen plotter. The plotter draws their unique drawing, and they can take it home and frame it.

There were far more details within the implementation, though:

  • The application had to be fast. Turning a dial on the MIDI controller should provide real-time feedback.
  • The output had to be compatible with the pen plotter, which only accepts vector assets.
  • The system had to handle a queue of artwork in case crowds came to the installation simultaneously.
  • Plots should take at most 10 minutes to make.
Interacting with canvas

The Generative Algorithms

I began prototyping some simple generative algorithms in P5.JS. These had specific themes (raytracing, spirals, and box stacking) and were custom-tuned to make interesting but quick plotter drawings.

Pen plotters move quickly but can still get weighed down by thousands of pen movements. Some of my artworks take over 10 hours for the machine to draw, which would not work in the SIGGRAPH Labs environment, where different art and technology experiences surround visitors, and they're quickly distracted. If I could hold their attention for over 5 minutes, that would be a win.

I sped up the drawing operations by resequencing the linework: ordering strokes to minimize travel moves and combining adjacent segments into longer continuous lines, which reduces pen lifts. With reasonable speed settings on my Axidraw, I was able to achieve sub-10-minute plots with three separate algorithms.
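
A minimal sketch of the combining step (the data shapes here are hypothetical): walk the list of segments and chain together any that share an endpoint, so each chain plots as one continuous pen-down stroke.

const samePoint = (p, q) => Math.hypot(p[0] - q[0], p[1] - q[1]) < 0.01;

// Merge segments ({ a: [x, y], b: [x, y] }) into polylines by repeatedly
// extending each path forward while another segment touches its tail.
function mergeSegments(segments) {
  const remaining = segments.slice();
  const paths = [];
  while (remaining.length > 0) {
    const first = remaining.shift();
    const path = [first.a, first.b];
    let extended = true;
    while (extended) {
      extended = false;
      for (let i = 0; i < remaining.length; i++) {
        const { a, b } = remaining[i];
        const tail = path[path.length - 1];
        if (samePoint(tail, a)) path.push(b);
        else if (samePoint(tail, b)) path.push(a);
        else continue;
        remaining.splice(i, 1);
        extended = true;
        break;
      }
    }
    paths.push(path); // one continuous pen-down stroke
  }
  return paths;
}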

Adding Interactions

Typically, generative art is fully autonomous: a pseudo-random number generator is called to create a randomized set of parameters. For AAS, I intended to pluck out the randomness and insert user controls.

I evaluated a few options for input devices. I landed on the Intech Grid series MIDI controllers for their simplicity and modularity. Meant to be snapped together, these controllers offered the correct number of inputs to encourage playful exploration without being overwhelming.

Intech Grid MIDI Controller

With my hardware and software selected, I went about connecting them. The WebMIDI API is a great way to send and receive data from any MIDI device, and it interfaces seamlessly with JavaScript. Soon, I was accepting MIDI signals within my P5.JS application and adjusting parameters.

let midi;

const onMIDIMessage = event => {
  // event.data is a Uint8Array: [status, data1, data2]
  console.log(event.data); // Change a parameter of the algorithm here.
};

const onMIDISuccess = midiAccess => {
  midi = midiAccess;
  // Subscribe to messages from every connected input device.
  midi.inputs.forEach((entry) => {
    entry.onmidimessage = onMIDIMessage;
  });
};

// The handlers must be defined before this runs, since const bindings
// aren't hoisted like function declarations.
if (navigator.requestMIDIAccess) {
  navigator.requestMIDIAccess().then(onMIDISuccess, () =>
    console.error("MIDI access was denied.")
  );
}

Rendering & Performance

When I make plotter art, I use P5.JS and the P5-SVG plugin to render my graphics as SVGs. This workflow allows me to download the files as vectors. Then, I can upload them to Saxi, a utility that controls my Axidraw pen plotters. The translation between coding art and having the pen plotter draw it is nearly seamless.
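
For reference, a bare-bones global-mode sketch of that workflow might look like this, assuming the p5-svg plugin is loaded alongside P5.JS (it provides the SVG renderer constant):

function setup() {
  createCanvas(600, 600, SVG); // p5-svg renders to an inline <svg> element
  noLoop();
}

function draw() {
  background(255);
  // A trivial stand-in for a generative algorithm.
  for (let r = 20; r < 300; r += 20) {
    circle(width / 2, height / 2, r);
  }
  save("drawing.svg"); // downloads the vector file, ready for Saxi
}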

Rendering a graphic at 60FPS and responding to user input started to become slow with this approach, though. I found that updating the DOM with new SVG elements 60 times a second wasn't ideal. I flipped back to the typical rendering approach used by P5.JS, which leverages HTML canvas. It immediately fixed all frame rate issues and made the application feel much more fluid when twirling the MIDI controls.

To get the performance of canvas and the compatibility of SVGs, I made two instances of P5.JS sketches, one with each rendering target. I only rendered the vector instance when a drawing was about to be transmitted to the cloud job manager.
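
Roughly, the dual-instance setup looked like the sketch below (names like drawArtwork and params are hypothetical, and it assumes the p5-svg plugin exposes its SVG constant in instance mode):

const params = { density: 0.5, rotation: 0 }; // mutated by the MIDI handlers

const drawArtwork = (p) => {
  p.background(255);
  // ...the shared generative algorithm, driven by params...
};

// Live preview: the default canvas renderer, running at 60FPS.
const canvasSketch = new p5((p) => {
  p.setup = () => p.createCanvas(800, 800);
  p.draw = () => drawArtwork(p);
});

// Vector twin: the p5-svg renderer, only redrawn on demand.
const svgSketch = new p5((p) => {
  p.setup = () => {
    p.createCanvas(800, 800, p.SVG);
    p.noLoop();
  };
  p.draw = () => drawArtwork(p);
});

// At submission time, render the vector version once and export it.
function renderForSubmission() {
  svgSketch.redraw();
  svgSketch.save("submission.svg");
}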

I needed to build an application UI to display helpful hints, an intro screen, and some submission steps. To do this, I brought in Svelte. Although I could have used React here, I like the ease of use that comes with Svelte. React felt heavy-handed since I wasn't worried about component reuse or application render performance at this layer. Never defeated, though, React managed to make its way into the stack later on.

The on-screen buttons match those on the Intech MIDI Controller

While reading about MIDI, I realized I could send signals to my Intech controller instead of simply receiving them.

A UX problem was bubbling to the surface: My application was growing in steps and pages, and users only got visual feedback from the screen. I came up with the idea of using these signals to tell my controller to change its LED button colors to indicate different button interactions, thus adding another mechanism for users to understand the process.

The Intech MIDI Controllers can run custom Lua. After some trial and error, I was able to write basic scripts that would listen for these MIDI messages and toggle various LEDs on the controllers.
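
On the browser side, the sending half of that exchange is small. A minimal sketch, reusing the midi access object from earlier (the note-to-LED mapping shown here is hypothetical):

function setButtonLED(note, on) {
  // Push a note-on message to every connected output; the Lua script
  // running on the controller listens for it and toggles the matching LED.
  midi.outputs.forEach((output) => {
    output.send([0x90, note, on ? 127 : 0]); // note-on, channel 1
  });
}

setButtonLED(36, true); // e.g. light the button the script maps to note 36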

A hiccup with APIs

After successfully connecting my application to MIDI, I moved the operation over to an iPad, which would be my vessel for the installation. I chose iPads because they have high-quality displays, USB-C connectors, and reasonably fast processors. Most of all, though, I already owned two iPads.

I realized that I had missed an important point: every browser on the iPad runs on Safari's WebKit engine, and WebKit does not support the WebMIDI API.

I immediately began looking up Android tablets and found that anything with decent performance would be a couple of hundred dollars, and build quality was iffy. I'm sure there are good options out there, but in a pinch, I got creative.

My solution was to skip the WebMIDI API and instead build a mobile app that could use native iOS MIDI APIs. I would render a web view within the native app and communicate with my application using post messages. This turned out to be mostly painless using Expo and React Native; that stack was the fastest path to success since I could keep using JavaScript. I spent $99 on an Apple Developer account only to sideload my application on two iPads. It was a decent compromise.
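
The core of the bridge is small. A minimal sketch using react-native-webview (the URL is a placeholder, and the function that feeds in MIDI data stands in for whichever native module provides it):

import React, { useRef } from "react";
import { WebView } from "react-native-webview";

export default function App() {
  const webviewRef = useRef(null);

  // Called by the (hypothetical) native MIDI module whenever a message
  // arrives; forwards the bytes into the page as a 'message' event,
  // replacing the WebMIDI listener the web app would otherwise use.
  const forwardMIDI = (bytes) => {
    const payload = JSON.stringify(JSON.stringify(bytes));
    webviewRef.current?.injectJavaScript(
      `window.dispatchEvent(new MessageEvent('message', { data: ${payload} })); true;`
    );
  };

  return (
    <WebView
      ref={webviewRef}
      source={{ uri: "https://example.com/drawing-app" }} // the P5.JS app
      onMessage={(e) => console.log("from web app:", e.nativeEvent.data)}
    />
  );
}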

Finally, I added a screen to the mobile app to capture the user's initials, making it easy for people to submit a job, leave, and return later to find their drawing on a table.

Capturing alphabetic input with MIDI controller sliders was a fun UX challenge.

The text selector
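
For the curious, one way to map a slider's 0-127 CC value onto the letters A-Z (a hedged sketch, not necessarily the installation's exact logic):

const letterFromCC = (value) =>
  String.fromCharCode(65 + Math.min(25, Math.floor((value / 128) * 26)));

letterFromCC(0);   // "A"
letterFromCC(127); // "Z"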

Sending Files

My client application was running on hardware, and I planned to have two instances on-site. The next problem was taking the graphic on the iPad and getting it to a pen plotter.

  • What would happen if there was a line and many people all wanted to make drawings in quick succession?
  • How could I later identify who made what, all while respecting user privacy?
  • What if the system went down, or I needed to swap out an iPad, computer, or pen plotter?

I decided to spin up a cloud instance and write a custom job manager that would let me see every submission and assign a status to each. This way, a queue of art could build up, and I could easily keep track of it. The application needed to accept information from any number of iPads running my sideloaded drawing app.

I built a NextJS application and deployed it to Digital Ocean's App Platform. I went with NextJS because of my familiarity with React and its ease of API creation via the /api directory. I could have used many other backend solutions to build this basic CRUD app, but having everything under one roof helped me iterate quickly.
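
A minimal sketch of such an API route gives the flavor (the table and column names are hypothetical): a single file under pages/api handles both job submission and queue listing.

// pages/api/jobs.js
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

export default async function handler(req, res) {
  if (req.method === "POST") {
    // An iPad submits a job: initials plus the URL of the uploaded SVG.
    const { initials, svgUrl } = req.body;
    const { rows } = await pool.query(
      "INSERT INTO jobs (initials, svg_url, status) VALUES ($1, $2, $3) RETURNING *",
      [initials, svgUrl, "queued"]
    );
    return res.status(201).json(rows[0]);
  }
  if (req.method === "GET") {
    // The admin panel lists the queue in submission order.
    const { rows } = await pool.query("SELECT * FROM jobs ORDER BY created_at");
    return res.status(200).json(rows);
  }
  res.status(405).end();
}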

I attached a Postgres database to my app to persist items in my queue. Then, I used DO Spaces to host the SVGs, which the iPads transmitted to my API.
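
Since Spaces is S3-compatible, the upload from the API can go through the standard AWS SDK pointed at a Spaces endpoint. A sketch with placeholder bucket and region names:

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const spaces = new S3Client({
  endpoint: "https://nyc3.digitaloceanspaces.com", // placeholder region
  region: "us-east-1", // required by the SDK; Spaces ignores it
  credentials: {
    accessKeyId: process.env.SPACES_KEY,
    secretAccessKey: process.env.SPACES_SECRET,
  },
});

async function uploadSVG(id, svgString) {
  await spaces.send(
    new PutObjectCommand({
      Bucket: "plotter-jobs", // placeholder bucket name
      Key: `${id}.svg`,
      Body: svgString,
      ContentType: "image/svg+xml",
      ACL: "public-read",
    })
  );
  return `https://plotter-jobs.nyc3.digitaloceanspaces.com/${id}.svg`;
}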

I tacked on a simple static microsite for the project using Eleventy. I bundled it with Digital Ocean's monorepo feature, thus saving me the cost of spinning up another app instance.

The "No Frills" admin panel

Conclusion

Everything was running smoothly. I dragged the entire setup outside into my backyard in Vermont. I shot some videos of the interaction and drawing process.

The final steps

I would be boarding a flight to LAX in a few weeks and setting up the installation.

My art studio uses Axidraw machines due to their reliability and hassle-free setup. There's one drawback to these fully assembled machines, though: they're hard to ship.

Thankfully, I was able to get support from the generative art community. Artblocks donated an Axidraw A4, and Casey Reas (famed co-creator of Processing) loaned me his own Axidraw A3, notably used by Piter Pasma for his installation works.

After five long days, participants created over 230 drawings. SIGGRAPH attracts an interesting mix of technical and creative people. I met so many of them, including some of the founders of computer graphics, like Jim Blinn. I even met a handful of other plotter artists, including one with a sketchbook of plots he was excited to pull out and show me, and another who worked with Hack Club and made the Haxidraw.

Live at SIGGRAPH

Gotchas

  • The installation requires an internet connection for the iPads to communicate with the job manager in the cloud and for a local laptop to communicate with the Axidraws. At SIGGRAPH, I was in the presence of dozens of other art installations and exhibits. Someone was saturating the network with huge amounts of data, disrupting my ability to communicate between devices. The intermittent connectivity led to some head-scratching debugging, but the SIGGRAPH IT team eventually set up more routers. Lesson: provide your own network.
  • What felt like a reasonably automated process translated into standing for 6-8 hours, changing papers and pens, all while having side conversations. It was exhausting, but I would do it again in a second.
  • Every day, the queue would grow to 25 orders, and we would have to shut it down and focus on getting through the drawings for that day. One more Axidraw plotter would have been nice.
  • People touch your stuff. Fortunately, nothing broke, but both the MIDI controllers and the iPads were at the mercy of the general public. After the event, I got the client running on an Orange Pi and a $100 display, thus eliminating the need for costly iPads.

Technologies used:

  • P5.JS
  • Canvas
  • Lua
  • Svelte
  • React
  • React Native
  • MIDI
  • Eleventy
  • Postgres
  • Digital Ocean App Platform
