If I import an Alembic animation and put it in a cloner inside Notch, is there a way to offset the individual clones in time? What I want is for the clones to play the same animation, but each with a different starting point!
This isn’t possible with clones in their current form. Basically, clones are designed to instance geometry - that is, to make exact copies of the same mesh. If the clones have different time values, they are no longer exact copies.
We get this request a lot, so we want to look into it, but it’s a fundamental limitation of the current design. The alternative is clones running much slower.
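A rough way to picture why (a sketch of the trade-off only, not Notch’s actual renderer): instancing evaluates and uploads one mesh, then draws N copies cheaply; giving each clone its own animation time forces a separate mesh evaluation per unique time, every frame. The function and numbers below are made up for illustration.

```python
# Illustrative cost model, NOT Notch's real pipeline: with instancing, one
# mesh is evaluated once and drawn N times; with per-clone time offsets,
# every clone with a unique animation time needs its own evaluated mesh.

def frame_cost(n_clones, unique_times):
    # One mesh evaluation per *unique* animation time; the cheap
    # single-instanced-draw path only applies while all clones are
    # identical copies of the same mesh.
    mesh_evals = unique_times
    draw_calls = 1 if unique_times == 1 else n_clones
    return mesh_evals, draw_calls

print(frame_cost(1000, unique_times=1))     # instanced clones: (1, 1)
print(frame_cost(1000, unique_times=1000))  # per-clone offsets: (1000, 1000)
```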
– Ryan
I have indeed also seen this feature requested a lot, and have the need for it myself. For non-realtime work it wouldn’t really matter that performance would suffer; adding a bit of render time is no issue for me. I’d love to see an animation effector to create, for instance, more convincing crowd simulations.
I used a particle system in Notch to simulate a flock of birds a while ago. I used the point renderer with texture animation to have the same animation of a bird play with different starting points. Maybe there could be a way of doing this with a 3D model in Notch and piping it through render layers…?
With tile sheets, it’s possible because you only have a single image to load. Each particle can just choose which part of the image it wants to use per frame, and switching is super fast because it’s just an image. To use the same technique with geometry would mean having every frame of your animation loaded simultaneously, which would put a lot of stress on VRAM for a start, and cause a huge performance drop.
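The tile-sheet trick can be sketched like this (a hypothetical illustration of the general flipbook technique, not Notch’s internals - the function name, grid layout, and offsets are all made up): each particle stores only a start-frame offset, and per frame you compute which tile of the single atlas texture it samples.

```python
# Flipbook / tile-sheet sketch: one atlas image holds every animation frame
# as a grid of tiles. A particle with a start offset just samples a
# different tile on the same global frame -- no extra geometry or VRAM.

def tile_uv(global_frame, start_offset, cols, rows):
    """Return the (u0, v0, u1, v1) rect of the tile this particle samples."""
    n_tiles = cols * rows
    frame = (global_frame + start_offset) % n_tiles  # loop the animation
    col = frame % cols
    row = frame // cols
    du, dv = 1.0 / cols, 1.0 / rows
    return (col * du, row * dv, (col + 1) * du, (row + 1) * dv)

# Two particles on the same global frame, offset by 5 frames, end up
# sampling different tiles of the same image:
print(tile_uv(global_frame=0, start_offset=0, cols=4, rows=4))
print(tile_uv(global_frame=0, start_offset=5, cols=4, rows=4))
```

Doing the same with geometry would mean keeping a separate baked mesh for every tile-equivalent frame in memory, which is exactly the VRAM problem described above.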
@murcje for offline rendering it’s more reasonable, as performance loss is less of a factor, but we still need to decide how to handle it under the hood - and there are far better methods than the one above. We will find a solution, just not in the short term.
– Ryan