How to make particles inherit velocity

Hey, any ideas on how to recreate this effect in Notch:
https://www.instagram.com/p/CIGA75EBSeC/
I tried using a Kinect Mesh as an attractor and adding a Fluid Affector to it, to some effect.
I also tried optical flow with an Image Emitter, and even an Image Field as a particle emitter. All of those work to some extent, and most of them work best with the SPH Affector after some tweaking of the settings, but my problem is mostly getting the particles to inherit the velocity from the initial movement of the point they are emitted from.

I'm also still having some trouble accurately controlling the particle size over life.
The particle scale coeff does not seem to conform to the particle life duration. I can get a change in the size over life, but it seems as if the change happens at the beginning and then the particle stays at the altered size for the rest of its life.
Scale over time mostly does the trick, but I can't control the size with a curve. And I have no idea how to use the size curve attribute. I've pretty much mashed every single/double/triple click, Alt/Shift/Ctrl, RMB combo I could think of, to no result; any pointers on that one are more than appreciated. It seems to be completely missing from http://manual.notch.one/0.9.23/en/topic/nodes-Particles-Rendering-Point-Renderer.

For this I think the Speed Tracking Modifier could be your friend.

I've tested it quite recently in some R&D for a project. I rigged up the above node to tell me how fast a null within an FBX skeleton (driven by Kinect Skeletal Data) is moving, and used a couple of range remaps to feed that into properties such as the Y length of a Shape 3D (used via Clone to Particles).
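If it helps to see the math spelled out, here's a rough Python sketch of what that speed-tracking and range-remap chain is doing conceptually. The function names and the example ranges are placeholders I've made up; this is an illustration of the idea, not Notch's actual implementation:

```python
# Conceptual sketch of a speed-tracking + range-remap chain (not Notch internals).

def speed(prev_pos, curr_pos, dt):
    """Approximate speed of a tracked null from two sampled positions."""
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    dz = curr_pos[2] - prev_pos[2]
    return (dx * dx + dy * dy + dz * dz) ** 0.5 / dt

def range_remap(value, in_min, in_max, out_min, out_max):
    """Map a value from one range onto another, clamped to the output range."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

# e.g. a hand null moving at 0..3 m/s driving a Shape 3D Y length of 0.1..2.0
hand_speed = speed((0.0, 1.2, 0.0), (0.01, 1.22, 0.0), dt=1 / 60)
y_length = range_remap(hand_speed, 0.0, 3.0, 0.1, 2.0)
```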

If you like, I can put together a shareable version of this where, instead of the Kinect Skeletal Data, the Speed Tracking Modifier gets its data from, say, a null that a Math Modifier is moving, and have that drive a Shape 3D's properties.


But it may be quicker for you to just grab that modifier and have a play.

Depending on how you want to emit particles, most particle nodes have emission velocity / transform weight properties which, as I recall, work well for taking the source transforms and getting particles to move along with them.
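To illustrate what that kind of property does in general particle-system terms, here's a plain Python sketch (not Notch's code; `inherit_weight` and `emit_particle` are names I've invented for the example):

```python
import random

def emit_particle(emitter_pos, emitter_vel, inherit_weight, base_speed):
    """Spawn a particle that carries a share of the emitter's own motion."""
    # A random direction for the particle's own emission velocity.
    direction = [random.uniform(-1.0, 1.0) for _ in range(3)]
    length = sum(d * d for d in direction) ** 0.5 or 1.0
    own_vel = [d / length * base_speed for d in direction]
    # Blend in the emitter's velocity: a weight of 0 ignores the source motion,
    # while 1 makes particles fully ride along with the moving emitter.
    vel = [own_vel[i] + inherit_weight * emitter_vel[i] for i in range(3)]
    return {"pos": list(emitter_pos), "vel": vel}
```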

If you're going from an image though, you need to use optical flow to read the motion in the scene, then apply that to the particles using either the same Image Emitter or an Image Affector. You could also take the fun route and go through fields, but why overcomplicate things :)

The problem I suspect you are running into is just a lot of noise from optical flow: it's tricky to get clean motion data from video, so it usually helps to clean it up a bit before it goes in.
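One common clean-up is temporal smoothing plus a magnitude gate, so flicker gets averaged out and tiny noise vectors stop pushing particles around. This is a hypothetical numpy sketch; the alpha and threshold values are guesses you'd tune by eye:

```python
import numpy as np

def smooth_flow(prev_smoothed, raw_flow, alpha=0.2, min_magnitude=0.05):
    """Blend new flow into a running average and drop near-zero vectors."""
    # An exponential moving average suppresses frame-to-frame flicker.
    smoothed = (1.0 - alpha) * prev_smoothed + alpha * raw_flow
    # Zero out vectors below a threshold so noise doesn't drive the particles.
    magnitude = np.linalg.norm(smoothed, axis=-1, keepdims=True)
    return np.where(magnitude > min_magnitude, smoothed, 0.0)

# Flow fields shaped (height, width, 2): per-pixel x and y motion.
state = np.zeros((720, 1280, 2))
new_frame_flow = np.random.randn(720, 1280, 2) * 0.1  # stand-in for real flow
state = smooth_flow(state, new_frame_flow)
```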

The particle scale stuff is trickier to visualise. Upload a DFX and I'll have something to offer there, but for now I think it might just be easier to use the Scale Affector and modify its life affect property.
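For reference, size-over-life in most particle systems boils down to sampling a curve at the particle's normalised age, roughly like this sketch (illustrative only, not how the Scale Affector is actually implemented):

```python
def size_over_life(age, lifetime, curve_points):
    """Linearly interpolate a size curve sampled at normalised life (0..1)."""
    t = max(0.0, min(1.0, age / lifetime))  # normalised age, clamped
    # curve_points is a list of (normalised_life, size) pairs sorted by life.
    for (t0, s0), (t1, s1) in zip(curve_points, curve_points[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return s0 + f * (s1 - s0)
    return curve_points[-1][1]

# Grow in quickly, hold, then shrink toward death:
curve = [(0.0, 0.0), (0.1, 1.0), (0.7, 1.0), (1.0, 0.0)]
size = size_over_life(age=1.5, lifetime=2.0, curve_points=curve)  # ~0.83
```

If a curve like this were only being sampled once at spawn instead of every frame, you'd see exactly the behaviour you describe: a jump at the start, then a constant size for the rest of the particle's life.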


The emission velocity / transform weight might be the way to go. Like everything else, it produces weird results when paired with the Kinect Mesh, but with a little tweaking maybe it could work.

As for the 2D approach, yes, the main problem is the optical flow. I get some cool results with the motion vector estimate, but that setting also produces a massive particle drift toward the bottom-right corner of the image. Since there is the NVIDIA body tracker, I am also thinking I might try to rig a super low-poly cage mesh and use that as the mesh emitter. Jamie Shaw's approach is also in line with that idea. It could also be paired with the Kinect Mesh, but the Kinect Mesh tracking is a little heavy on resources.

As for the particle scale, I think I've kind of got the hang of it: it requires some balancing between the over-life value graph, the life value, and the different scale/size values on the renderer and affector, but once most things are set it is quite manageable. Thanks.

Cool! Sorry I haven't shared my DFX yet, I've been busy, but I'll be happy to share it soon. I was actually going to start a thread about it.

I think your idea is great if you have objects with exposable data, but the Kinect Mesh is more like an animated displacement map on a plane mesh, so it's really not applicable in my case. But I think I might have an idea that could work. If I get something working, I will share it here :)

Yeah, fair enough. I'm not using the Kinect Mesh itself, only the skeletal tracking and an FBX rig (with the geometry not visible).