B2 Processing/Max/Syphon/Resolume Demo

Introduction

I recently had the chance to build something in the Atlas B2, a cool space that has multiple projectors, mocap, lights, and 40-speaker surround sound.

I’d done a number of cool projects in p5.js for Cacheflowe’s Creative Code class, but since p5.js only runs in a browser I’d have to figure out Processing instead. Fortunately I spent many years as a Java developer, and the language has only gotten better in the intervening decade.

Since I only got two hours in the space, I tried to figure out a solid plan of attack first. Here’s what I settled on:

  1. Use the Qualisys mocap system to track an object around the space
  2. Take the OSC output from Qualisys and feed that into a Max Patch that can transform the coordinate space
  3. Feed the Max Patch data into a Java application using the OscP5 library
  4. Create a 7170x1080 off-screen canvas
  5. Draw a sparkler/firework effect on the canvas with a rudimentary particle system (see the sketch after this list)
  6. Apply a GLSL filter to create a bloom effect
  7. Offload the canvas using Syphon
  8. Pick up the resulting canvas again in Resolume Arena and send it to the projectors
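
To give a flavor of step 5, here’s a minimal sketch of the kind of rudimentary particle I have in mind, in Processing-style Java. The class and its constants are illustrative, not the repository’s actual code:

```java
// Illustrative sparkler particle; the real repository's classes differ.
class Particle {
  float x, y, vx, vy, life;

  Particle(float x, float y) {
    this.x = x;
    this.y = y;
    float angle = random(TWO_PI);  // scatter sparks in all directions
    float speed = random(1, 4);
    vx = cos(angle) * speed;
    vy = sin(angle) * speed;
    life = 255;                    // doubles as the spark's alpha
  }

  void update() {
    x += vx;
    y += vy;
    vy += 0.05;                    // a touch of gravity
    life -= 4;                     // fade out over ~60 frames
  }

  boolean dead() {
    return life <= 0;
  }
}
```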

Results

Let’s do the fun bit first: here’s the demo. Many thanks to Brad for working with me on getting all the fiddly bits connected quickly.

The not-so-great stuff

  • I wasn’t able to get Processing to render a GLSL filter. It worked fine on my local machine (Linux with a discrete NVIDIA card) and worked slowly on the Mac, but it seemed like Syphon sent the frame before the filter ran, and I couldn’t figure out how to resolve that (see the sketch after this list). Fortunately Resolume Arena has a filter that applies the same bloom effect, so I was able to get it there.
  • The mocap system wasn’t on its best behavior and kept dropping the tracking. It did work some of the time, but it was hard to make it reliable.
  • The frame rate was super low: even before adding a filter in Resolume Arena I was only able to render at about 24 fps at the full 7170x1080 resolution of the projector array. I’m not sure what was going on here; my laptop had no issues running the same Java code at a full 4K resolution, which is more pixels (though a different shape). And that’s comparing the mobile Ryzen processor in my laptop to a Mac with an i9, which should surely outperform it.
  • I ended up not needing the full 7170 width (because the scrim was blocked at the edges), which reduced the usable space to about 5000x1080. This was still too slow, so I went for a half-resolution canvas; that brought me down to 2500x540, which ran pretty well.
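
For reference, the filter ordering I expected to work looks roughly like this, using the `syphonCanvas` and `server` names from the Java code section below (`bloom.glsl` is a placeholder filename, not the actual shader):

```java
// Sketch of the intended ordering: apply the shader inside the
// beginDraw()/endDraw() pair, then hand the finished frame to Syphon.
PShader bloom = loadShader("bloom.glsl"); // placeholder filename

syphonCanvas.beginDraw();
// ... draw the particles onto syphonCanvas ...
syphonCanvas.filter(bloom);   // should run before the frame goes out
syphonCanvas.endDraw();
server.sendImage(syphonCanvas);
```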

The Java Code

You can check out the GitHub repository; it should be reasonably self-explanatory.

I create a Syphon canvas as follows:

```java
syphonCanvas = createGraphics(commandLineWidth, commandLineHeight, P2D);
server = new SyphonServer(this, "FreeSwim1");
```

and when I’m done with each frame I can offload it with:

```java
syphonCanvas.endDraw();
server.sendImage(syphonCanvas);
```

This shows up magically as a source in Resolume.

Receiving OSC packets was also pretty easy in Java: I pass `this` in and make the main sketch class implement the `OscEventListener` interface:

```java
// in setup(): start listening for incoming OSC on the configured port
oscP5 = new OscP5(this, oscport);

@Override
public void oscEvent(OscMessage message) {
    for (int i = 0; i < emitters.length; i++) {
        // check if the OSC address matches this emitter's pattern
        String emitterPattern = "/emitters/" + i;
        if (message.checkAddrPattern(emitterPattern + "/x")) {
            // parse the coordinate out of the message
            float value = message.get(0).floatValue();
            if (emitters[i] != null) {
                emitters[i].x = value * width;
            }
        }
        if (message.checkAddrPattern(emitterPattern + "/y")) {
            float value = message.get(0).floatValue();
            if (emitters[i] != null) {
                emitters[i].y = value * height;
            }
        }
    }
}
```
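
Note that the incoming coordinates are assumed to be normalized to the 0–1 range by the Max patch; multiplying by `width` and `height` maps them onto the sketch’s pixel space.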

The build is set up to use Gradle and also builds automatically from my repository using a GitHub Action, configured in .github/workflows/gradle-publish.yml. Each time a change is made to the code, GitHub automatically spins up a virtual machine and builds a new version of the project. I use the shadowJar target to build a jar file with (almost) all the dependencies included in it. I did have to copy the Syphon JNI bindings into the same directory as this jar file and didn’t have time to troubleshoot that step.

```yaml
    - name: Build with Gradle
      uses: gradle/gradle-build-action@bd5760595778326ba7f1441bcf7e88b49de61a25 # v2.6.0
      with:
        arguments: shadowJar

    - name: Get current date and time in Denver
      id: date
      run: echo "::set-output name=datetime::$(TZ="America/Denver" date +'%Y-%m-%d-%H-%M-%S')"
      shell: bash

    - name: Upload artifact
      uses: actions/upload-artifact@v2
      with:
        name: freeswim-${{ steps.date.outputs.datetime }}.jar
        path: app/build/libs/*.jar
```

The automatic build process helped me iterate more quickly, as I could easily make changes to the code on my laptop (while working in the B2) and a few seconds later the built jar file would be available for download.

Max Patch

I used a Max patch to do all the glue logic between the different parts of the system. It receives the mocap data, transforms it into the coordinate space I need, and then sends that data on to my Processing application. Max listens on port 55555 for the mocap data, and Processing listens on port 12000.

The source file is on GitHub, but you can see how it looks here:

Max configuration

Note the atan2 configuration to convert the position of a mocap tracker to an angle (in radians) from the center of the space.
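
The same mapping in Java terms, in case the patch is hard to read (the center coordinates here are placeholders, not values taken from the patch):

```java
// Hypothetical Java equivalent of the patch's atan2 step: convert a
// tracker position (x, y) to an angle in radians around the room center.
// centerX/centerY are illustrative; the real patch has its own values.
float angleFromCenter(float x, float y, float centerX, float centerY) {
  // atan2 handles all four quadrants and returns a value in (-PI, PI]
  return atan2(y - centerY, x - centerX);
}
```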

Conclusion

I look forward to getting more time in the space to actually build something real. The 2-hour slot I was given was pretty fun for building a little demo, but I’d have been lost without the generous help of Brad Gallagher.

A large-scale dataviz project is really appealing to me, but I’d need to find a way to drive the display at full resolution and frame rate before getting too far into it.