Record camera tracking data for postproduction on XR projects

Hi there,

In some recent XR projects we had trouble aligning the extended reality to the LED stage. Calibration always seems to be a bit iffy at certain angles, no matter how many points we calibrate. I know, this is more a Disguise thing than a Notch one…

There’s also the issue of latency, which is hard to work around, especially with sudden and rapid camera movements.

So when things break on screen (misalignments), the client has a hard time understanding why it’s happening, we have an even harder time trying to explain it, and we end up going into post to try and fix it. For live events, of course, that’s not an option.

What would make my life a lot easier is the option to record the camera movement that Disguise is feeding the Notch block, so I can re-render parts that need fixing or alteration and use those as assets in our post-production process. We can record the raw data from the camera tracking system, but converting, re-aligning and matching it so that it works within the original Notch block for re-rendering is a real drag (almost undoable, compared with simply video-tracking the camera path in post).

Is there a way to record this data as it is coming into and being interpreted by the Notch block?

Do you have any best practices for re-aligning XR to what is recorded in-camera?

Thanks for your time!!

Marc

Hi Marc,

I know, this is more a Disguise thing than a Notch one…

Yeah, I’m afraid that’s going to be the short and simple answer here. The case you bring up has been brought to our attention a few times, and you’re right that Notch is “just” the content creation and render engine here. We simply take in the data passed to us by the media server – in your case, disguise.

What would make my life a lot easier is the option to record the camera movement that Disguise is feeding the Notch block, so I can re-render parts that need fixing or alteration and use those as assets in our post-production process.

Yeah, that would be ideal. I’d raise this with Disguise and make the point that you’d like the ability to record the tracking data. Given how the media servers pass data on to Notch, their platform is the right place to record/buffer it.

I happen to know that at least one other user has used TouchDesigner to manage tracking and record the data there. It can then be played back and rendered out from Notch in separate passes, which also bypasses the latency issue you mention. Of course, this only works for post work (as in the recording and playback – TD is, of course, also viable for real-time camera tracking). So while it probably won’t help on your current project, it might be worth looking into for the future. I’m sorry I wasn’t able to be more helpful here.
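For anyone who wants to prototype the record-and-replay idea outside a media server, here is a minimal Python sketch. It is hypothetical – not TouchDesigner- or disguise-specific, the packet format is left opaque, and `TrackingRecorder`, `load_frames` and `replay` are made-up names for illustration. The idea: capture the raw tracking packets with timestamps during the show, then replay them with the same timing into an offline session for the re-render pass.

```python
import json
import time


class TrackingRecorder:
    """Capture raw tracking packets together with elapsed-time stamps,
    so the stream can be replayed later with its original timing."""

    def __init__(self):
        self.frames = []   # list of (seconds_since_start, hex_payload)
        self._start = None

    def add(self, payload, now=None):
        """Store one packet; `now` defaults to a monotonic clock reading."""
        now = time.monotonic() if now is None else now
        if self._start is None:
            self._start = now
        self.frames.append((now - self._start, payload.hex()))

    def save(self, path):
        """Persist the recording as JSON for a later offline session."""
        with open(path, "w") as f:
            json.dump(self.frames, f)


def load_frames(path):
    """Load a saved recording back into (seconds, bytes) pairs."""
    with open(path) as f:
        return [(t, bytes.fromhex(h)) for t, h in json.load(f)]


def replay(frames, send, sleep=time.sleep):
    """Re-emit recorded payloads, preserving the inter-frame gaps."""
    prev = 0.0
    for t, payload in frames:
        sleep(max(0.0, t - prev))
        send(payload)
        prev = t
```

In practice you’d feed `add()` from a UDP receive loop (e.g. `socket.recvfrom()` on whatever port your tracking system broadcasts on) while the show runs, and point `send` at a socket aimed at the offline renderer.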


Thanks, Bent, for taking the time to elaborate on this. I understand what you’re saying – I’ll definitely check out TD!

Best,

Marc


Hope you find a solution!

It’s worth saying that although the workflow is pretty complicated at the moment, it is possible to calibrate a disguise XR system so that your backplate & set extension are in sync. It’s difficult to offer further advice without knowing a lot more about your system, and Disguise support may be better placed to help you work through your issues. But know it is possible!


Hi.

You can already record all incoming data in disguise, and then add it to the timeline to play it back. 🙂


Hi Ckhallas,

Thx for your response!

OK, I guess this could work, but then I’d need a disguise license and the disguise project running offline – that is, not connected to the physical tracking camera and the LED screens. Right?

Need to learn me some disguise to figure this out 🤔