Any plans to implement the new Kinect Azure sensor in the near future?
I just bought two of them. So far I’m liking them much more than my Kinect v2s.
Besides being much smaller and having a little more depth resolution than the v2s, being able to hardware-sync multiple units using only a 1/8" audio cable opens up some great possibilities. The SDK also seems improved, including the skeletal tracking.
Good news is they are only $400 in the US and are available to everyone, not just developers.
I’ve only been playing with them for a bit, but they feel less taxing than the v2s. I’m running two on my laptop at full frame rate.
Hopefully this can somehow make its way onto the “To Do List” sooner rather than later.
Thanks for all the hard work, it’s appreciated.
Yes, we’re planning to support the new Kinect in the next release which is 0.9.24.
That’s great news, Will. I suspect the Azures will eventually replace the v2s in professional situations due to their size, weight, and simpler cabling. It would be nice if the price came down a bit.
Any rough ETA for 0.9.24?
No ETA for 9.24 yet, it’s going to be a big one.
What are you using the Kinect Azures in at the moment? I just got one too and have been messing around with it. I’ve found that with the Kinect v2 I get a better track in TouchDesigner, so I do all the creative work in Notch, take the block into Touch, and link up the parameters. I’d say it’s a good workaround (although a Touch Pro license is a bit steep), but Touch doesn’t seem to be working well with the Azure as of yet. I could see that changing in a week or so.
Currently I am using the Azure in TouchDesigner. I was happy to see they already had support for it. I currently have the non-commercial license for TD and am seriously thinking of throwing down the extra $1600 to make it a Pro license so I can run Notch Blocks in the TD environment.
To be honest, I’m not so impressed with the Kinect v2 implementation inside of Notch. I was very surprised that they have not implemented the user mask (user segmentation) in their environment. There is also no IR stream output, which I use when calibrating it against a high-def camera. When I was reading the Notch forum, their suggestion for getting a user mask was to set a depth threshold and hope for the best. The user segmentation within the Microsoft SDK makes it so easy and gives me what I want.
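For reference, the depth-threshold workaround looks roughly like this. This is a minimal numpy sketch with a tiny synthetic depth frame (the array values are illustrative, not from any SDK), showing why thresholding alone cannot separate a person from other objects at the same distance:

```python
import numpy as np

# Synthetic 4x4 depth frame in millimetres: a person at ~1500 mm,
# plus a prop elsewhere in the frame at the same distance.
depth = np.array([
    [3000, 3000, 3000, 3000],
    [3000, 1500, 1500, 3000],
    [1500, 1500, 1500, 3000],   # leftmost pixel is the prop, not the person
    [3000, 1500, 1500, 3000],
], dtype=np.uint16)

# Naive depth-threshold key: keep everything between a near and far plane.
near_mm, far_mm = 1000, 2000
key = (depth > near_mm) & (depth < far_mm)

# The prop at row 2, column 0 sits inside the same depth band as the
# person, so it leaks into the key -- the "hope for the best" part.
print(key.astype(np.uint8))
```

Anything else that wanders into that depth band (props, other people, furniture) ends up in the key, which is exactly the limitation being described.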
I suspect Luke and the other guys have their hands pretty full over there and only implemented the basics into Notch and skipped the User Mask and IR stream outputs for now. I’m hoping that they eventually add those features to the Kinect v2 and the Azure, but for now TD is where I’m spending most of my time with it.
I do like Notch. It renders quite nicely, and I especially like that we can make Blocks that can be used within D3 and some of the other servers. That being said, it’s only image data in and out, very much like using Syphon or Spout. It would be nice if down the road they allowed passing geometry and other data types, not just image data and parameter control.
What problems are you having with the Azure running in TD? So far it’s been working for me. I’ve only had my Azures for about five days, so I haven’t pushed them too hard yet. Maybe this weekend I will dive in deeper.
I do wish that Notch had a way of encapsulating chunks of code into objects like TD does. I find it harder to build large Notch creations with everything on the same page (level). I like to lay everything out logically so I can absorb everything that is going on and which part is doing what. TD does a better job of that; it’s more similar to Max/MSP and Quartz Composer, two environments I’ve been using for a long time.
But with Notch’s ability to make plugins for D3, I end up using it more often, because it seems everyone is using D3 for content playback for the most part. I am very happy to see how fast they are building up their development environment to include the bits we need.
Just chiming in here: the Kinect Azure is on our roadmap to support, and on the topic of TD as a media server, it is very capable (we’re seeing lots of interesting projects done with that team-up).
As for the implementation of the Kinect v2 in Notch: we’re supporting the most commonly used feature set, and to be completely honest, who thought it would still be in use? It’s an 8-year-old device that’s been discontinued for almost four years. The fact that it’s still in use is pretty nuts (you can’t even buy them any more, unless you get lucky on eBay).
I’d say that the chances of expanded Kinect v2 support are rather low, but as mentioned, the new one is on our roadmap. I can’t say when it will be supported, but we aim to do it at some point.
I completely understand why putting more time into the Kinect v2 makes less sense than the new Kinect Azure. However, I think you might be mixing up the original Kinect (the structured-light one) with the Kinect v2 (time-of-flight). I believe the Kinect v2 was released in 2014 and was sold and supported until recently; it might even still be supported. Kinect v2s can easily be found on Amazon and elsewhere. It’s the Kinect v1s that are 8 years old and harder to find. I don’t use mine anymore, and I bought a bunch of them. The Kinect v2s are so big and heavy, not to mention all that cabling and the interconnect boxes, so I’m happy to move on to the Azures in the v2s’ place.
I really do appreciate all the responses and am happy to hear that the Azure is on the roadmap. My big request would be to also add the User Mask and IR streams as available outputs. The User Mask can easily be used to make a depth key that includes only the person, and not other things in the scene at the same depth.
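To illustrate the difference from a plain threshold, here is a hedged numpy sketch of turning a per-pixel user mask (in the style of the SDK’s body index map, where 255 marks background) into a depth key. The arrays are synthetic stand-ins, not actual SDK calls:

```python
import numpy as np

# Synthetic depth frame (mm) and a body-index map acting as the user
# mask: 255 = background, 0 = first tracked body (SDK-style convention).
depth = np.array([
    [3000, 1500, 1500],
    [1500, 1500, 3000],   # left pixel: a prop at the person's depth
    [3000, 1500, 3000],
], dtype=np.uint16)

body_index = np.array([
    [255,   0,   0],
    [255,   0, 255],      # the prop is background in the user mask
    [255,   0, 255],
], dtype=np.uint8)

# Depth key from the user mask: only pixels belonging to body 0 keep
# their depth, even though the prop shares the person's distance.
person_depth = np.where(body_index == 0, depth, 0)
print(person_depth)
```

Because the mask is per-body rather than per-distance, the prop at the person’s depth drops out of the key, which is what a depth threshold alone cannot do.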
Thanks so much for the team’s efforts.
I might be wrong on the dates; I just had a hunch it was old in both regards and looked it up on Wikipedia (which lists Kinect for Windows (v2) as launched in 2012 and discontinued in 2016).
I’ll be sure to pass along the request for User Masks and IR Streams to the dev team!
You may be right. I wouldn’t be surprised if there is some conflicting information with the names and release dates on sites like Wiki. I think we both agree that the v1 and v2 are dead or dying products. Time for the new.
Thanks for passing on those requests. I do believe other users will find them useful as well and worth the time to include.
Touch and Notch seem to work really well together. I thought they’d be redundant, but Notch is such a rendering and creative-design powerhouse, and Touch is so good at gathering, manipulating, and organizing data, that you can use each program to cover where the other one lacks. And you can always kick NDI into a d3 (but now you’re needing a separate system).
Stuart, so you aren’t having issues with the Azure tracking in TouchDesigner? (Not that this is a TouchDesigner forum.) Mine is being all wonky, and I don’t seem to be able to see the hands tracking at all.
I think I’m minutes away from upgrading my TD license to Pro. It’s a big price jump, but I think it’s going to be worth it. I’m finding that Notch is brilliant at many things but falls short on other things I find in other environments. I guess I was trying to do everything in one environment. Notch is much better for me for coming up with art and effects; TD is much better for organizing and overall program structure. I’m going to do it, Stephen: get the Pro upgrade.
As far as tracking the hands with the Azure goes, I spent most of my time just looking at the body and noticing it could pick up me and the kids as people even when we were all sitting down to start. The Kinect v2 and older liked to see a standing subject.
I will let you know how it goes.
Well, I did it. Threw down the extra $1600 and went Pro with TD. I had a problem with Blocks not loading until I realized I had my Notch Builder dongle in instead of my Notch Player dongle. Since then, smooth sailing so far.
I am considering getting the TD Pro licence.
Does this mean that you can’t just go with the Notch Builder licence and need to get an extra playback dongle?
Notch Builder Pro comes with Playback of up to 4k content included, so no additional license should be required.
Update on the Azure Kinect support: it will be supported in a future patch release to 0.9.23, which will be out “soon”. However, not all advanced features will be supported in the first iteration.
Awesome news! Looking forward to it.
P.S. I don’t know if this is the right place to ask, but does anyone know if the Azure Kinect is supported by 3D scanning software (Skanect, Artec Studio) in the same way as the old Kinect v2? Sorry for the off-topic.
Awesome. Great news. Thanks Bent.
It’s out now in the 0.9.23.097 (or newer) release.
I have been using the Azure in Notch for the past few weeks and am enjoying the integration. Thank you for the work that has been done so far. I saw the note about not all advanced features being available right away. Is using multiple Azure Kinects in the same project a possibility in future releases? I haven’t been able to get that to work, and I would love to have at least two devices running for a couple of different interactive regions.