My goal is to set up a project that alternates between realtime graphics and playback of stereoscopic 360° video (4096×4096px @ 60fps). So far I have only tested with a still image (Notch trial), using the Skybox node as described in the manual. It works well as long as the resolution is set to 1024 max. When I change it to 2048 or higher, which I assume corresponds to 4096×2048px per eye in my case and which would be the setting needed to get maximum sharpness from my source material, it stutters tremendously when I turn my head. There is nothing in the scene but a VR Headset Camera and a Skybox node.
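To make the per-eye maths explicit, here is a tiny sketch assuming the source is a top-bottom stereo equirectangular frame (that layout is my assumption; side-by-side layouts also exist, and some players put the right eye on top):

```python
def split_top_bottom(frame):
    """Split a top-bottom stereo frame (a list of pixel rows) into
    (top_half, bottom_half), one half per eye. Which eye is on top
    depends on the source material, so treat the order as an assumption."""
    h = len(frame)
    return frame[:h // 2], frame[h // 2:]

# A stand-in frame: a real 4096x4096 frame would yield two 4096x2048 halves.
frame = [[0] * 8 for _ in range(8)]
left_eye, right_eye = split_top_bottom(frame)
print(len(left_eye), len(right_eye))  # 4 4 (half the rows each)
```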
On my machine (GTX 980) I can also play the video version at full resolution without problems (e.g. with DeoVR, or the Whirligig player using Media Foundation), so the hardware should not be the problem.
From reading the manual, I assume it could have something to do with the Skybox node converting the material to a cube map, which costs performance?
One approach I know from other packages is to create two spheres, one holding the 360° image for the left eye and the other the one for the right eye. Each sphere is then made visible to only one eye in the VR headset. Is there a way for me to do this in Notch? And do you think it would help performance?
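For clarity, this is the per-eye visibility scheme I mean, sketched engine-agnostically; every name below is hypothetical and not part of any Notch API:

```python
# Hypothetical sketch of the two-sphere trick: each sphere carries an
# eye mask, and each eye's render pass only draws objects whose mask
# includes that eye. Names and structure are illustrative only.
LEFT_EYE, RIGHT_EYE = 0b01, 0b10

class Sphere:
    def __init__(self, name, eye_mask):
        self.name = name
        self.eye_mask = eye_mask  # which eye(s) may see this object

def visible_for(objects, eye):
    """Return the names of objects rendered into the given eye's view."""
    return [o.name for o in objects if o.eye_mask & eye]

spheres = [Sphere("sphere_left", LEFT_EYE),   # textured with left-eye image
           Sphere("sphere_right", RIGHT_EYE)] # textured with right-eye image

print(visible_for(spheres, LEFT_EYE))   # ['sphere_left']
print(visible_for(spheres, RIGHT_EYE))  # ['sphere_right']
```

The point of the scheme is that each eye samples a plain equirectangular sphere directly, with no cube-map conversion step in between.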
Thanks a lot,