Official Quick Tips

This is a thread where we collect all the quick tip posts.

Quick tips are little nuggets of advice that we want to share with our community!

Combining procedural primitives is a great effect. In this quick tip, we'll show you how to generate individual colours for each of the combined primitives.

These colours will take on both the procedural blending and material properties.

Here’s a video that demonstrates how it’s done:

The Math Modifier is a great and simple way to animate your nodes in Notch.

This quick tip video shows you how to create a seemingly complicated animation of a bouncing ball in 3 very simple steps.
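The maths behind that kind of bounce is worth seeing on its own. The snippet below is a conceptual illustration only, not Notch code (the function and parameter names are made up): a bouncing ball is essentially a rectified sine wave with an exponential decay, which is exactly the kind of expression a Math Modifier can drive.

```python
import math

def bounce_height(t, frequency=3.0, decay=1.5, start_height=1.0):
    # A rectified sine gives the repeated "bounce" shape;
    # the exponential term makes each bounce lower than the last.
    return start_height * abs(math.sin(frequency * t)) * math.exp(-decay * t)
```

Evaluating this over time produces a ball that hits the floor at regular intervals and loses height with each bounce.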

Buying 3D meshes is a great way to populate your Notch scenes.

Occasionally you’ll come across assets that have vertex colours applied.

This quick tip video explains why some meshes may have vertex colours on them and how to turn them off if you don’t need them.

Baking the lighting in your Notch scenes is a great performance optimisation for real-time projects.

Understanding UV sets can be tricky. In this quick tip video, we'll teach you the simple concept of how UVs work and why you need a 2nd set of unique UVs for your baked lighting.
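As a rough mental model (not Notch code; the helper below is hypothetical), the key difference is that texture UVs may tile or overlap, while lightmap UVs must give every face its own unique region of 0..1 space, so the baked light for each surface is stored exactly once:

```python
def make_lightmap_uvs(num_faces, cols=4):
    # Assign each face its own non-overlapping cell in 0..1 UV space,
    # unlike tiling texture UVs, which many faces may share.
    rows = -(-num_faces // cols)  # ceiling division
    uvs = []
    for i in range(num_faces):
        u, v = (i % cols) / cols, (i // cols) / rows
        uvs.append((u, v, u + 1 / cols, v + 1 / rows))  # cell bounds
    return uvs
```

Real lightmap unwrappers pack islands far more cleverly than this grid, but the uniqueness requirement is the same.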

Notch has a few different ways to scale the viewport as you work.

This tip will show you the options for resizing, scaling, and cropping. They can be found under “Project Settings” / “Rendering”.

Here’s a video that demonstrates how it’s done:

Can we save our own preset resolutions?
I imagine a large portion of projects, like my own, use custom screen sizes.
On that note, can we create defaults for new projects?

Saving custom/default resolutions: not just yet. But it's a reasonable idea; I'll talk to the team and see what we think about it.

The viewport shading options are extremely useful for looking into your scene to get a better understanding of what’s going on under the hood.

Whether it’s optimising a scene, or just inspecting the various texture passes, these tools make it easy to dissect your scenes.

Here’s a video that demonstrates how it’s done:

The Target Input is a great way to control the direction a node points in a scene.

By connecting nulls or other objects to the target input on a node, you can control the focus and direction of that node.

This technique is most commonly applied to lights and cameras, but it works on any node with a target input.

Here’s a video that demonstrates how it’s done:

The Tweening Null is a great way to smooth the transforms of nodes, either to clean up erratic motion or to smoothly blend between multiple input positions.

Just add a node (or nodes) to the Tweening Null's input, connect it to the root, and you can route the output to wherever you want the smoothed transforms to go.

Here’s a video that demonstrates how it’s done:
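Conceptually, a tween of this kind is just exponential smoothing: each frame, move a fraction of the remaining distance toward the target. This is a sketch of the general technique, not Notch's actual implementation (the function name and smoothing value are illustrative):

```python
def tween(previous, target, smoothing=0.15):
    # Move a fixed fraction of the remaining distance each frame.
    return previous + (target - previous) * smoothing

# Smoothing an erratic input signal frame by frame (illustrative values).
targets = [0.0, 10.0, 10.0, -5.0, -5.0]
value = targets[0]
smoothed = []
for t in targets:
    value = tween(value, t)
    smoothed.append(value)
```

Sudden jumps in the input become gradual approaches in the output, which is why the same idea works both for cleaning up erratic motion and for blending between positions.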

When working on the timeline in Notch, it can be practical to loop certain segments that are shorter than your layer.

The Range tool lets you set a short (or long) loop and when enabled, the timeline will play back that segment instead of your full project or layer. This is handy when you want to refine a certain section of your timeline.

Here’s a video that demonstrates how to do it:

Procedural systems are a great way to create fast, accurate collisions for complex geometry.

This method isn't unique to physics nodes; you can use it with particles too!

Here’s a video that demonstrates how it’s done:

Translucent shadows from RT Glass materials are a super nice effect to use with raytraced scenes and are only one checkbox away.

It’s a simple matter of setting up the raytraced lights correctly and adding some absorption colours to the glass material.

Here’s a video that demonstrates how it’s done:

Don’t need certain layers to be the full length of your composition?

No problem!

In this short video, we’ll show you how to trim your layer length which can be useful for many things.

In this example, we’ll show you how to use it to edit multiple cameras to create simple camera cuts easily and quickly.

Here’s a video that demonstrates how to do it:

Switching between materials on an asset or surface is a great feature.

Doing this in Notch couldn’t be simpler using the Select Input node.

Use this node to switch between materials manually or automatically while taking advantage of Notch’s vast range of modifiers.

Here’s a video that demonstrates how to do it:

The Select Child node is a great way to dynamically change between objects in your scene.

In this example, we have three different primitives parented to the Select Child node. This makes it possible to switch dynamically between each shape using the child index parameter along with a Math Modifier.

Here’s a video that demonstrates how it’s done:

Node stacking order is an integral part of how Notch works and understanding this is important.

Put simply, Notch evaluates nodes from top to bottom and from left to right, so a node's position on the node graph determines the order in which it is drawn relative to other nodes on that same graph.

Here’s a video that demonstrates how it’s done:
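As a simplification, the rule can be modelled as a sort on node position. This sketch takes one plausible reading of "top to bottom, then left to right" (vertical position as the primary key); the real traversal also accounts for hierarchy, which this ignores:

```python
def evaluation_order(nodes):
    # Sort by vertical position first, breaking ties
    # with horizontal position.
    return sorted(nodes, key=lambda n: (n["y"], n["x"]))

graph = [
    {"name": "Light",     "x": 5, "y": 2},
    {"name": "Root",      "x": 1, "y": 1},
    {"name": "Particles", "x": 0, "y": 2},
]
```

Here "Root" is processed first, then "Particles", then "Light": moving a node around the graph changes where it lands in that ordering.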

Are there any general rules of thumb with regard to the best stacking order for different types of nodes? For example, should lights generally be processed after geometry, and particles before fields?

I understand in most cases it will come down to the nature of the scene itself, e.g. whether a particle system comes before the Field system that is looking at it, but I'm just curious whether there are any more globally applicable orders to remember.

Technically, everything gets evaluated from the top left downwards, by position and hierarchy. In practical terms, this doesn't impact everything: shapes and lights depth-test properly, so you don't have to worry about their position (aside from keeping a clean node graph). With Post-FX the effect is most obvious; particle emitters can also sort based on the processing order if the emitter counts are the same, and procedurals are very dependent on it too.

With that said, that doesn't mean it can be forgotten about entirely outside of those examples. You can create systems where the order of processing is important. For example, in a particle system that uses a field as an emission source, the particles will be a frame late if you process them first. Not the end of the world, but it's something to keep in mind :slight_smile:

– Ryan
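The one-frame lag described above can be sketched with two toy systems, where B (standing in for the particles) reads the output of A (the field). The names and per-frame values are purely illustrative:

```python
def run(order, frames=3):
    # Simulate per-frame updates in the given processing order.
    state = {"A": 0, "B": 0}
    history = []
    for frame in range(1, frames + 1):
        for name in order:
            if name == "A":
                state["A"] = frame       # the field produces a new value
            else:
                state["B"] = state["A"]  # the particles read the field
        history.append(state["B"])
    return history

run(["A", "B"])  # [1, 2, 3] -> B stays in sync with A
run(["B", "A"])  # [0, 1, 2] -> B always sees last frame's A
```

Processing the reader before the writer means it only ever sees the previous frame's data, which is exactly the one-frame lag to watch for.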
