Particle morphing / destination, from one video to another

Hi there!

I’m new to Notch and I haven’t been able to find a solution to this problem online, so here I am!

I want to transform a video into a particle-generated fluid (this part works great thanks to the templates), but then the fluid has to morph into another image. It doesn’t work if I try to change the emitting image over time, because that only changes the source of the particles; it needs to be their destination.

In other words: the particles start from a video, turn into a fluid that keeps the colors of the first video (this part is ok), and then, as a destination and not a source, change color and reassemble to morph into a second video.

I hope that’s clear enough

I’m using the basic video > optical flow > particle > field template, which works great for the first part, but then I’m stuck.

I tried playing with different image affectors, but none of them work.

Maybe it’s very simple and I just don’t know which node creates that effect, or maybe it’s just not possible.

Thanks in advance for your help


I’m going to piggyback on this post. (Sorry @erbaf.l if this goes on a tangent!)

A destination target would be amazing, with a variable strength. With a very high / infinite strength, you could pass 16-bit destination pixels into Notch if you’re using something like TouchDesigner as a container, letting you calculate particle positions outside of Notch in real time while still using Notch’s rendering nodes for the final visualization. (You can use a particle cache for offline, but not for real time.)

So, to dig deeper: an image effector, or a mode for existing affectors, where the pixel at a given index only affects the particle at that same index within the particle root would be a massive win for Notch. Of course, this would not work (or would create fun, weird results!) with affectors and emitters that cycle or that don’t have the same pixel count. In TouchDesigner you can specify interpolation (nearest pixel / interpolate / mipmap) for mismatching texture sizes of particle xyz data, so perhaps that’s part of the puzzle here too.
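The index-matched "destination" behaviour described above can be sketched outside of Notch. This is a toy numpy illustration, not Notch or TouchDesigner code: it assumes one particle per pixel and models the variable-strength pull as a frame-rate-independent lerp toward the position stored at the same index in an incoming position texture.

```python
import numpy as np

def step_toward_targets(positions, target_pixels, strength, dt=1.0):
    """Pull each particle toward the position stored at the SAME index
    in the target texture (one particle per pixel).

    strength=0 leaves particles untouched; a very large strength
    effectively pins particles to the incoming pixel positions,
    which is the 'infinite strength' case described above."""
    # target_pixels: (N, 3) xyz positions decoded from a float texture
    # (e.g. a 16/32-bit TOP exported from TouchDesigner - hypothetical).
    blend = 1.0 - np.exp(-strength * dt)   # frame-rate independent lerp factor
    return positions + (target_pixels - positions) * blend

# Toy example: 4 particles converging on 4 target pixels.
pos = np.zeros((4, 3))
targets = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
for _ in range(100):
    pos = step_toward_targets(pos, targets, strength=2.0)
print(np.allclose(pos, targets, atol=1e-3))  # prints True: particles arrived
```

The exponential blend is just one way to make the pull stable regardless of frame time; an actual implementation inside a node graph would express the same idea with an affector strength parameter.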


Okay so after some digging, it actually looks like the Image Displacement node can work similarly to the feature I was describing. I never noticed it, as it’s under the shading category instead of affectors!

@erbaf.l - To solve your problem, maybe you could reframe it. I don’t think you can have a ‘destination image’. However, you could fade between your source and destination videos as the input to an Image Displacement node, while animating the displacement amount and adding turbulence during the transition. Everything stays deterministic / repeatable, but it ‘looks’ more random than it is. The Particles Video Emitter template is a good place to start for this sort of experiment.
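The crossfade-plus-displacement idea can be sketched as a toy numpy function (this is an illustration of the timing, not Notch code; the noise term is a hypothetical stand-in for the Image Displacement / turbulence nodes): the input fades from source to destination while the displacement amount ramps up in the middle of the transition and back down to zero at the end, so the final frame lands cleanly on the destination video.

```python
import numpy as np

def transition_frame(t, src, dst, turbulence_amp=0.2, seed=0):
    """Blend two frames (HxWx3 arrays) for transition time t in [0, 1],
    with a displacement term that peaks mid-transition and vanishes
    at both ends."""
    rng = np.random.default_rng(seed)          # fixed seed: deterministic / repeatable
    mix = np.clip(t, 0.0, 1.0)
    blended = (1.0 - mix) * src + mix * dst    # fade source -> destination
    # Displacement is strongest mid-transition (sine ramp 0 -> 1 -> 0),
    # so t=0 shows pure source and t=1 pure destination.
    amount = np.sin(np.pi * mix) * turbulence_amp
    noise = rng.standard_normal(src.shape)     # stand-in for real displacement
    return blended + amount * noise

src = np.zeros((4, 4, 3))
dst = np.ones((4, 4, 3))
print(np.allclose(transition_frame(0.0, src, dst), src))  # prints True
print(np.allclose(transition_frame(1.0, src, dst), dst))  # prints True
```

The key design point is that the displacement envelope returns to zero at t=1, which is what makes the particles appear to "arrive" at the second video instead of drifting past it.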


Thanks for your reply!

What you describe seems very powerful. I don’t use TouchDesigner, but I think I should give it a try as well.

Concerning the effect, I don’t need it to be realtime, which might make it more doable.

I don’t need it to be absolutely accurate either, just to look like the fluid is creating the second video, so I don’t mind using other ways to cheat it. It’s just that nothing I tried worked; it always looks like a regular bad morph that generates particles (my real trouble is the ending, arriving at the final image).

I’ll look into your solution with the Image Displacement node. It sounds promising if I can make it look like a fluid with turbulence, and maybe cheat with other particles.

The Particles Video Emitter is the template I use, with great results; it’s just lacking the final part of the transition, which I’m unable to figure out.

Thank you for your help, I’ll try it; it might work.


If there’s no need for realtime, you can simply make video 1 dissolve into particles/fluid, do the same for video 2, render out the frames, reverse the dissolve of video 2, and blend it together with the ending of video 1. That trick worked for me on multiple occasions.
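The reverse-and-blend trick can be sketched offline as a toy numpy example (frames here are small arrays standing in for rendered image sequences; the function names are illustrative, not part of any real pipeline):

```python
import numpy as np

def blend_dissolves(dissolve_a, dissolve_b):
    """Offline trick: given rendered frames of video A dissolving into
    particles and video B doing the same, play B's frames in REVERSE
    and crossfade from A's dissolve into the reversed B, so the clip
    ends on B's first (intact) frame."""
    b_rev = dissolve_b[::-1]                   # reverse B's dissolve
    n = len(dissolve_a)
    out = []
    for i, (fa, fb) in enumerate(zip(dissolve_a, b_rev)):
        t = i / (n - 1)                        # 0 -> 1 over the clip
        out.append((1.0 - t) * fa + t * fb)    # crossfade A -> reversed B
    return out

# Toy frames: A fades 1 -> 0 while dissolving, B fades 2 -> 0.
a = [np.full((2, 2), 1.0 - i / 4) for i in range(5)]
b = [np.full((2, 2), 2.0 - i / 2) for i in range(5)]
frames = blend_dissolves(a, b)
print(np.allclose(frames[0], a[0]))    # prints True: starts on A's first frame
print(np.allclose(frames[-1], b[0]))   # prints True: ends on B's first frame
```

Because both dissolves are rendered independently, each one only has to look good falling apart; the reversal does the hard work of making the second video appear to assemble from the fluid.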

Good luck!


Thank you both for your help !

With the idea from particularexperience, using a fluid affector on the particles and alpha-blending the video transition while the fluid is exploding, the trick works: it looks like the fluid itself is changing, and not just the emitter.

murcje, I will try this solution as well. It might also work to make the particles “go back to normal” (which is kind of doable with fluid), do that for both videos, and then blend the two together.

Thanks again, now I can get close enough for the effect to work.

Great day to you !