I see the Screen Projection node, but I have no idea how to use it. I read the manual, but it hasn’t become any clearer.
Any tips?
The same question applies to the tracking features.
I have the same question. Maybe someone can explain how this is relevant to the xR/virtual set world.
This node outputs what the virtual camera sees onto the LED screen, so the real camera sees the same thing. In disguise it’s called perspective mapping.
I also have the same question…
Can anyone put together a quick demo to explain this “Screen Projection” node in more detail?
(I also read the manual.)
Thank you.
Hey there - I’m going to take a second pass on the manual page to clear it up, but for now here’s a simple explainer:
The Screen Projection node uses a camera as the point of perspective, and then renders the scene onto a 2D quad from that perspective. This is useful when you have content that doesn’t naturally conform to a 2D surface (lots of geometry, some particle renderers, and so on), you need to render it onto a screen surface, and you have a tracked camera so the perspective matches.
Here’s a quick example I’ve thrown together:
ScreenProjection.dfx (359.2 KB)
Keep in mind it’s based on what the camera can see of the content, so if you move or rotate too far you’ll see that the final output starts missing parts.
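To make the idea a bit more concrete, here’s a rough numerical sketch of the projection math behind this kind of perspective mapping (purely an illustration of the general technique, not the node’s actual implementation; the camera position, field of view, and function names below are made up for the example):

```python
# Minimal sketch of perspective mapping: project a point on the screen
# surface through the tracked camera to get the UV used to sample the
# camera's render of the content. Illustrative only.
import numpy as np

def look_at(eye, target, up):
    """Build a right-handed view matrix for the (tracked) camera."""
    f = target - eye
    f /= np.linalg.norm(f)              # forward
    r = np.cross(f, up)
    r /= np.linalg.norm(r)              # right
    u = np.cross(r, f)                  # true up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def perspective(fov_y_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    proj = np.zeros((4, 4))
    proj[0, 0] = f / aspect
    proj[1, 1] = f
    proj[2, 2] = (far + near) / (near - far)
    proj[2, 3] = (2 * far * near) / (near - far)
    proj[3, 2] = -1.0
    return proj

def project_to_screen_uv(world_point, view, proj):
    """Project a world-space point into the camera's image plane and
    return it as a 0..1 UV coordinate for sampling the rendered content."""
    p = proj @ view @ np.append(world_point, 1.0)
    ndc = p[:3] / p[3]                  # perspective divide
    return (ndc[:2] + 1.0) * 0.5        # NDC (-1..1) -> UV (0..1)

# Example: a tracked camera 3 m back from the screen, at eye height,
# looking at a point on the screen surface.
view = look_at(np.array([0.0, 1.7, 3.0]),
               np.array([0.0, 1.7, 0.0]),
               np.array([0.0, 1.0, 0.0]))
proj = perspective(fov_y_deg=50.0, aspect=16 / 9, near=0.1, far=100.0)
print(project_to_screen_uv(np.array([0.5, 1.7, 0.0]), view, proj))
```

This is also why the caveat above applies: any part of the screen surface that projects outside the camera’s 0..1 UV range simply has nothing to sample, so the output goes missing there when you move or rotate too far.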
I also came here from the documentation, and the file really helps. Thank you =)