Using an image source to animate lighting chases

Hey gang.

Part of a current project led me to put this workflow together, with the help of Notch support getting the custom pixellation process sorted (thanks, team!).

I thought I would share the evolution of it here.

The graph takes an image and ‘pixellates’ it using a Cloner system and a Render to Texture node - the number of pixels in X and Y is the number of clones, and there’s a region by the Root node to enter those numbers.
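(If you’re wondering what the cloner grid is effectively doing, it boils down to a block average of the source. Here’s a rough Python/NumPy sketch of that idea - this isn’t Notch code, and the function and variable names are just illustrative:)

```python
import numpy as np

def pixellate(image, cells_x, cells_y):
    """Block-average an H x W x 3 image down to cells_y x cells_x 'pixels',
    roughly what the cloner grid + Render to Texture combo produces."""
    h, w, _ = image.shape
    out = np.zeros((cells_y, cells_x, 3))
    for j in range(cells_y):
        for i in range(cells_x):
            # pixel-space bounds of this cell in the source image
            x0, x1 = i * w // cells_x, (i + 1) * w // cells_x
            y0, y1 = j * h // cells_y, (j + 1) * h // cells_y
            out[j, i] = image[y0:y1, x0:x1].mean(axis=(0, 1))
    return out
```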

The resolution is extracted and sent to other parts of the graph, where you can crop slices of your input using real pixel values (which makes it easier to line up with pixel maps etc.).

The results are repositioned to fill the original resolution and each of those is sent to a light.
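(For the crop maths: each clone/pixel maps to a simple rectangle in source pixel space, which is then scaled back up to the full frame and fed to its own light. A rough sketch of that mapping - again plain Python just to illustrate the idea, with an example resolution and grid size rather than anything baked into the graph:)

```python
def cell_crop_rect(width, height, cells_x, cells_y, i, j):
    """Real-pixel crop rectangle (x, y, w, h) for cell (i, j),
    based on the extracted source resolution."""
    cw, ch = width // cells_x, height // cells_y
    return (i * cw, j * ch, cw, ch)

# e.g. a 1920x1080 source split into an 8x4 grid:
for j in range(4):
    for i in range(8):
        x, y, w, h = cell_crop_rect(1920, 1080, 8, 4, i, j)
        # each crop gets rescaled back to the full resolution
        # and drives one light
```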

Colour: if the Light nodes are white they’ll take on the input colour; if a colour is set in the node it’ll overwrite the input, so it’s just up to you.

So to recap: the incoming source is pixellated by a fixed number in X and Y, and each of these pixels is isolated, rescaled and sent to a volumetric spotlight.

IN THIS GRAPH!! I have changed the geometry in the cloner system from a Plane to a Circle so that the end result appears as it should, like a spot light.

CHANGE IT BACK! If you want a straight-up custom pixellation of an image, change the cloner’s mesh to a Plane and disregard all the crop/transform/lighting stuff. These custom pixellations can be really useful if you’ve UV’d something to a neat grid and want to make the polys ‘light up’ one by one…

Happy graphing.

Oh, P.S. this graph currently has an NDI input, so you’ll need to switch it out or stream something to it - try some simple 2D patterns like the ones in the video.

DFX IS HERE

QUESTIONABLE DEMO VIDEO HERE

[RESOLUME NDI EXAMPLE SCENE]
