Hi! It would be nice to have an option to time-shift animated clones with the time-stretch node. Is there any possibility, or a proper way to time-shift animated clones in time?
Or could this be a new effector category for time-delay / time-stretch effects?
Is there already a solution for time-shifting an incoming live camera feed (like the buffer methods in TouchDesigner)? Or for pre-animated objects (like animated .fbx or .abc with clones, or even objects pre-animated directly on the Notch timeline)?
I've attached a screenshot about the idea.
Here I also wanted to ask how the morph deformer works, because there is nothing about it in the documentation. Perhaps a time-stretch node with a morph deformer could also do the trick?
Many thanks in advance,
What type of animation are you trying to achieve?
In the example you give, could you not just use a modifier?
Yes, this is something we have thought about a few times in various ways; sadly it's not quite as simple as you'd think. The cloner nodes rely on every clone being an instance, i.e. an exact copy of the original mesh. For nodes with animations, this means every clone will always show the same time, as they all copy from the same animated source. This is also why you can't clone lights or other arbitrary nodes: they can't be instanced the way geometry can. The main advantage of instancing is that it's super efficient - you can create thousands of clones and they barely impact the GPU. Copying nodes would lose this benefit.
To make this work, we'd need to totally rewrite the cloner system to add support for copying objects. While we do want to look into a solution for this in the future, it's not in any near-term plans.
On the morph controller: that node provides controls for blend shapes, which also wouldn't work for this.
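The instancing trade-off described above can be sketched roughly in code. This is only an illustration of the concept (all class and method names here are hypothetical, not Notch's actual API): instanced clones all sample one shared animated source at the same time, while per-clone time offsets would force a separate evaluation per clone.

```python
class AnimatedMesh:
    """One animated source; evaluating it at time t yields a pose."""
    def pose_at(self, t):
        return f"pose@{t:.2f}"

class InstancedCloner:
    """Instancing: every clone references the SAME source, so every
    clone shows the same animation time. The source is evaluated once,
    regardless of clone count - this is why instancing is so cheap."""
    def __init__(self, source, count):
        self.source, self.count = source, count
    def render(self, t):
        shared_pose = self.source.pose_at(t)   # evaluated once
        return [shared_pose] * self.count      # all clones identical

class CopyingCloner:
    """Copying: each clone carries its own time offset, so animations
    can be staggered - but each clone must now be evaluated (and stored)
    separately, losing the efficiency of instancing."""
    def __init__(self, source, offsets):
        self.source, self.offsets = source, offsets
    def render(self, t):
        return [self.source.pose_at(t + o) for o in self.offsets]
```

With the instanced cloner, `render(1.0)` returns four identical poses; the copying cloner can stagger them, at the cost of one evaluation per clone.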
Hi! I've attached a mini-screenshot. I would like to spread animated walk cycles in space and control them the way I can control static objects with clones - in particular, to shift the start time of each loop. The manual way would be tedious: a lot of copy/paste plus a time-stretch node per copy, manually shifting each time offset. One possible workaround is rendering the sequences, then rendering tile-sheets from them and using several particle systems - but again, how do I control the starting points globally, i.e. the time shifting?
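The per-clone offsets described above (evenly spaced or random start times for a looping walk cycle) boil down to simple arithmetic. A minimal sketch, assuming hypothetical helper names - this is the math a cloner effector would apply, not an existing Notch feature:

```python
import random

def clone_time_offsets(count, cycle_len, mode="periodic", seed=None):
    """Start-time offset per clone for a looping animation.
    'periodic' spreads offsets evenly across one cycle;
    'random' scatters them within the cycle."""
    if mode == "periodic":
        return [cycle_len * i / count for i in range(count)]
    rng = random.Random(seed)
    return [rng.uniform(0, cycle_len) for _ in range(count)]

def local_time(global_t, offset, cycle_len):
    """The time each clone samples its walk cycle at, wrapped to loop."""
    return (global_t + offset) % cycle_len
```

For example, four clones on a 2-second cycle in periodic mode get offsets `[0.0, 0.5, 1.0, 1.5]`, so each walks a quarter-cycle ahead of the previous one.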
Oh, thanks for the quick answer - I only just noticed it. Understood. I hope it's not boring that I always come up with this time-shift problem - time-shifting with particle sprites and so on :)
What is the specific effect you want to achieve? Perhaps there is another way to approach it.
Hi Bent! I think Ryan has already answered my question about time-shifting animated clones (with the time-stretch node). I see now that it is not possible to 'copy' and then clone with the cloner node.
The effect would be like a classic 'video wall' effect - but not with videos, rather with animated 3D objects, time-shifted (periodically or in a random manner), spread in 3D space, and controlled with fall-offs in time as well. But I see it is beyond the realtime world, since these things need more resources - scrubbing back and forth in the timeline of each animated object makes the hard drive work a lot, which also requires more time to calculate.
Since Notch is so quick to produce images, I try to solve all my problems with it - even these classic problems. I often find myself in situations where the current director is asking for such video-wall effects. I attach another example picture at the end of my post, from a project where I had 20 minutes of video footage of an actor's gestures and had to fill three walls with them, spread in space and shifted randomly in time. This was a low-budget production (as usual), so I had three days to set up the projectors, produce the content for a two-hour theater piece, and learn the piece and its timings, then do the playback on the fourth day. As a good client, the director of course wanted changes at the last moment, so there was no room or time to go back and re-render everything with a traditional 3D package - I had to solve it on the fly in order to have at least a few minutes to also think about the aesthetics :).

Another scenario: I had one and a half hours of video footage about cultural inventions, and I had to turn this source into a kind of 'virtual' museum where the virtual camera flies through the 'history of art'. I put image planes in space - in this case I had to shift the start times in chronological order. It was fine, but then the source footage had to be re-edited, so the timing changed - and again, no time to re-render. I've seen a video on YouTube - I suppose it was made by Ryan - where he quickly produced an image wall from an Instagram feed in a very clever way. I would like to do that, but with videos, without manually editing each of them. I could edit 10 videos in an hour, but not 120.

In my next project I will have a live camera input, and it would be nice to store a couple of seconds temporarily in video memory and then feed it into a 'particle' system to spread the frames in 3D space, of course with effects (and connected with the geometry-connection renderer).
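The "store a couple of seconds of live camera" idea above is essentially a frame ring buffer, the approach TouchDesigner-style cache operators take. A minimal CPU-side sketch under that assumption (the class name and API are hypothetical; a real implementation would keep the frames in GPU memory):

```python
from collections import deque

class FrameDelayBuffer:
    """Ring buffer holding the last few seconds of a live feed, so that
    delayed copies of the stream can be sampled for a video-wall effect."""
    def __init__(self, fps, seconds):
        self.fps = fps
        self.frames = deque(maxlen=int(fps * seconds))

    def push(self, frame):
        """Append the newest frame; the oldest is dropped automatically."""
        self.frames.append(frame)

    def sample(self, delay_seconds):
        """Return the frame from delay_seconds ago, clamped to the
        oldest frame still held in the buffer."""
        back = min(int(delay_seconds * self.fps), len(self.frames) - 1)
        return self.frames[-1 - back]
```

Each particle or image plane would then call `sample()` with its own delay, giving the staggered-in-time wall without editing the source footage.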
(By the way, it would be nice to have an option for curved lines in the geometry-connection renderer node, with a point-resample option for bezier-like lines. Since these are geometry, the smooth deformer works here too - but then you lose the connection's start and end points.)
(The 3D walk-cycle/cloner example in the previous post would be for a personal project - but also one with a deadline.)