Hi All,
I’m new to Notch and I’m trying to build a simple interactive installation. I already have a Hokuyo UST-10LX sensor connected and working in Notch, but I have some questions:
The interactive area is 8m by 3m, I’m aiming to detect people’s movement, and the projection will be on the ground. Are there any attribute settings that help stop detection points from jumping on and off?
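(For anyone searching later: beyond whatever smoothing/damping the Hokuyo node itself exposes, the usual fix for flickering points is temporal smoothing of the tracked position. A minimal sketch of an exponential moving average in Python, just to illustrate the idea; all names here are hypothetical, not Notch API.)

```python
# Exponential moving average to damp jitter on a tracked 2D point.
# Lower alpha = heavier smoothing but more lag behind the real position.

def smooth(prev, new, alpha=0.3):
    """Blend the new reading toward the previous smoothed position."""
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))

# Feed noisy readings through the filter; the (5.0, 5.0) spike is damped.
readings = [(1.0, 1.0), (1.2, 0.9), (5.0, 5.0), (1.1, 1.0)]
pos = readings[0]
for r in readings[1:]:
    pos = smooth(pos, r)
```

A single outlier reading moves the smoothed point only part of the way, so brief sensor dropouts or spikes no longer make the locator jump.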
How can I avoid detecting points from static objects in the area? Right now it is reporting data for static objects.
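(The standard technique for this, independent of any particular Notch node settings, is background subtraction: record a baseline scan of the empty room, then ignore any beam whose distance stays close to that baseline. A rough sketch of the idea in Python; the function names and tolerance are my own, not Notch's.)

```python
# Background subtraction for a 1D LiDAR distance scan.
# baseline[i] is the distance beam i sees in the empty room; any reading
# close to the baseline is treated as a static object and dropped.

def capture_baseline(scans):
    """Average several empty-room scans into a per-beam baseline."""
    n = len(scans[0])
    return [sum(s[i] for s in scans) / len(scans) for i in range(n)]

def foreground(scan, baseline, tol=0.1):
    """Return (beam_index, distance) for readings that differ from the
    baseline by more than tol metres, i.e. moving objects only."""
    return [(i, d) for i, (d, b) in enumerate(zip(scan, baseline))
            if abs(d - b) > tol]

baseline = capture_baseline([[4.0, 4.0, 2.0], [4.0, 4.0, 2.0]])
hits = foreground([4.0, 1.5, 2.0], baseline)   # person blocking beam 1
```

The static object at beam 2 (2.0 m in the baseline) is filtered out, and only the person at beam 1 survives.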
If I need to add another sensor, can I connect the two sensors to the same computer?
I managed to connect two sensors to my computer and tested them in Notch; they work fine. (I’m using a router and changed the IP address of one sensor in the URGBenri app.)
Now the issue is that I have two Hokuyo Array nodes (one for each sensor). When the sensors detect one object, each array node creates its own locator for that object. Is there any way to merge the locator points from the two arrays, so that in the end I have a single locator point for each detected object?
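(The underlying idea, if you ever post-process the locator data yourself, is distance-based deduplication: if a point from array A and a point from array B fall within some merge radius, treat them as the same person and keep their midpoint. A sketch under those assumptions; everything here is hypothetical illustration, not a Notch feature.)

```python
import math

# Merge locator points from two sensor arrays. Points closer than
# merge_radius are assumed to be the same person; keep their midpoint.

def merge_locators(a_pts, b_pts, merge_radius=0.3):
    merged = []
    used = set()
    for p in a_pts:
        match = None
        for j, q in enumerate(b_pts):
            if j not in used and math.dist(p, q) <= merge_radius:
                match = j
                break
        if match is None:
            merged.append(p)                      # only seen by array A
        else:
            used.add(match)
            q = b_pts[match]
            merged.append(((p[0] + q[0]) / 2,     # seen by both: midpoint
                           (p[1] + q[1]) / 2))
    # Points only array B saw
    merged.extend(q for j, q in enumerate(b_pts) if j not in used)
    return merged

pts = merge_locators([(1.0, 1.0)], [(1.1, 1.0), (6.0, 2.0)])
```

Here the two nearby points collapse into one locator at (1.05, 1.0), while the point only the second sensor saw is kept as its own locator.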
We don’t support combining data from multiple Hokuyo sensors at present, but it is something on our radar. You can use the bounding regions to limit where each Hokuyo sensor can track, but fair warning: if two sensors have line of sight to each other, they can beam interfering data into each other and cause problems.
After 15-20 minutes of running with people walking and emitting flowers, the emitters connected to the Hokuyo sensor arrays stop emitting flowers. The sensors are still working, but no more flowers are emitted.
The emission only works again once I stop, reset the timeline, and press play again (then everything goes back to normal).
Any idea how I can leave the program running for 3 hours and have it constantly emit particles (flowers) every time a person walks through the sensor area?