Posts: 13
Threads: 8
Joined: Feb 2024
Reputation:
0
19-01-2026, 02:26 AM
(This post was last modified: 19-01-2026, 02:26 AM by CptnFabulous.)
I’m trying to get cloth garments on a character model to behave looser without entirely slipping off, and from my research I think modifying the skin constraints will be the best solution. I’m following the documentation (https://obi.virtualmethodstudio.com/manu...aints.html) and these look like exactly the right settings I need to alter, but I’m having trouble understanding how to change them, since there are no clearly exposed properties or functions.
I had a look at the documentation for scripting constraints (https://obi.virtualmethodstudio.com/manu...aints.html), but I haven’t really figured out how to access specific settings. From what I can tell, constraint batches are created only when a particle is created with different settings, and they’re grouped into batches at runtime to limit the number of different settings in memory. But I don’t really know how to add new ones. For now I only need one set of constraints for the entire cloth actor, and they don’t really need to change after being generated (although if they can that might help with debugging). The cloth shapes are being generated procedurally from procedural meshes, so it needs to all work through code.
I looked at the example in that documentation (which is conveniently already for setting the skin constraints), and tried copying it and turning it into a standalone function with parameters instead of hardcoded values. But I got null errors when trying to retrieve the batches, so there didn’t seem to be values there to edit. I realised that the example code uses an ObiSkinnedCloth, whereas our current system just uses regular ObiCloth components, so I switched the cloth, renderer and mesh renderer components to the right ones. But now I'm getting an out of range error when trying to get the index offset and access the correct batches.
I’m sure I’m going about this completely the wrong way. Maybe I’m running this code at the wrong point during the generation, or I’m meant to modify values in the blueprint rather than the cloth itself. Any help would be greatly appreciated.
Thanks!
Posts: 6,716
Threads: 28
Joined: Jun 2017
Reputation:
435
Obi Owner:
19-01-2026, 11:10 AM
(This post was last modified: 19-01-2026, 11:51 AM by josemendez.)
(19-01-2026, 02:26 AM)CptnFabulous Wrote: I’m trying to get cloth garments on a character model to behave looser without entirely slipping off, and from my research I think modifying the skin constraints will be the best solution.
Hi!
You need to increase the skinRadius property of the constraints. This property controls how far from the animated position the cloth can get, so it's literally the "looseness" of the cloth.
(19-01-2026, 02:26 AM)CptnFabulous Wrote:
From what I can tell, constraint batches are created only when a particle is created with different settings, and they’re grouped into batches at runtime to limit the number of different settings in memory.
Batches are *always* created; their purpose is not to limit the number of different settings (particle or constraint setting deduplication is unrelated to this: you could have the exact same properties for all particles and constraints and you'd still get batches). The reason batches are created is to prevent race conditions, which are a basic problem in multithreaded code:
When you have multiple threads working on something, it's fine if the data they're working on is completely independent: they can go on reading and writing data without stepping on each other's toes. However, as soon as two or more threads read from or write to the same memory location, you must ensure they do it in a specific, predictable order, or you'll get incorrect results. For instance, if you have two threads naively trying to increment the same variable (say, A = 5), you may get this order of operations:
- thread #1 reads value of variable A, 5.
- thread #2 reads value of variable A, 5.
- thread #1 does A = A+1, so A = 6.
- thread #2 does A = A+1, so A = 6.
Since thread #2 didn't wait for thread #1 to write its result into A before reading A and incrementing it again, you end up with A = 6 even though the expected result would be A = 7, since 2 threads have incremented the variable.
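The lost update described above can be simulated deterministically (a Python sketch of the interleaving, not real threads):

```python
# Deterministic simulation of the lost-update interleaving described above.
# Each "thread" first reads A, then later writes its incremented copy back;
# because both reads happen before either write, one increment is lost.

def run_interleaving(initial):
    A = initial
    read_1 = A       # thread #1 reads A (5)
    read_2 = A       # thread #2 reads A (5), before thread #1 has written
    A = read_1 + 1   # thread #1 writes A = 6
    A = read_2 + 1   # thread #2 overwrites with its own stale result, A = 6
    return A

print(run_interleaving(5))  # 6, even though two increments ran: one update was lost
```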
There are 3 ways to solve race conditions like these (from worst to best performing):
- Use thread synchronization primitives like mutexes or semaphores to make sure only one thread can be executing a specific section of code at any given time.
- Use atomic/interlocked instructions (in many cases, basic operations like integer addition/subtraction or compare-and-swap (CAS) are the only atomics available, so while faster than mutexes they're a bit limiting). An atomic operation performed by a thread is guaranteed to complete before any other thread steps in.
- Avoid having race conditions in the first place.
Batches are a way of implementing option #3, that is, ensuring race conditions aren't possible. They're based on a concept called graph coloring, which strives to segregate a bunch of items into as few groups as possible, in such a way that items in the same group don't share any specific property. This makes it safe to process all items in a group in parallel.
In Obi's case, the items are constraints, and they're batched so that constraints in the same batch don't affect the same particles. All constraints in a batch can be processed by multiple threads, since we know no two threads will simultaneously attempt to operate on the same data (particles). For physics, batching has an additional benefit: minimal jittering, since the order in which constraint corrections are applied to particles is the same every frame.
Skin constraints are a bit of a special case regarding batches, since each skin constraint only affects one particle. So typically you'd only need one batch of skin constraints as they don't share particles with other skin constraints. Other constraints (distance constraints, which operate on 2 particles each, or bend constraints which operate on 3 particles) do need multiple batches in most cases.
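To make the idea concrete, here's a tiny Python sketch of greedy constraint batching. This is not Obi's actual coloring algorithm (which runs at blueprint generation time), just an illustration of the invariant: each constraint is a tuple of particle indices, and it joins the first batch whose constraints share none of its particles.

```python
# Toy greedy graph coloring: assign each constraint to the first batch whose
# existing constraints share no particle with it. Constraints in the same
# batch touch disjoint particle sets, so they can be solved in parallel.

def batch_constraints(constraints):
    batches = []  # each batch: (list of constraints, set of particles it touches)
    for c in constraints:
        particles = set(c)
        for members, used in batches:
            if used.isdisjoint(particles):  # no shared particle: safe to add here
                members.append(c)
                used |= particles
                break
        else:                               # every batch conflicts: open a new one
            batches.append(([c], particles))
    return [members for members, _ in batches]

# Distance constraints on a strip of 4 particles: 0-1, 1-2, 2-3.
# 0-1 and 2-3 share no particle, so they fit in one batch; 1-2 needs another.
print(batch_constraints([(0, 1), (1, 2), (2, 3)]))  # [[(0, 1), (2, 3)], [(1, 2)]]
```

A skin constraint touches a single particle, which is why (as noted above) one batch usually suffices for all of them.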
Batches are created during blueprint generation and stored in the blueprint, since graph coloring is a lengthy process (can take a few seconds). At runtime, batch data is simply copied over to the solver when the actor is inserted into the simulation.
(19-01-2026, 02:26 AM)CptnFabulous Wrote:
But I don’t really know how to add new ones. For now I only need one set of constraints for the entire cloth actor, and they don’t really need to change after being generated (although if they can that might help with debugging). The cloth shapes are being generated procedurally from procedural meshes, so it needs to all work through code.
You don't have to add new batches to loosen the cloth. Assuming you already have a skinned mesh and a cloth blueprint generated from it, you just need to change the skin radius of already existing constraints in already existing batches. In your case it might be possible to modify the blueprint itself after generating it (write to blueprint.skinConstraints.skinRadiiBackstop) before the cloth gets loaded into the solver.
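The write itself might look something like the following. This is a Python sketch over a plain list, not the Obi API: it only illustrates the data layout, assuming (as Obi's scripting docs suggest) that each constraint's skin parameters are stored interleaved in a flat float list as skin radius, collision radius, backstop. Treat that layout and all names here as assumptions.

```python
# Hypothetical sketch of loosening skin constraints, assuming per-constraint
# parameters are interleaved in a flat float list with stride 3:
# [skin radius, collision radius, backstop, skin radius, ...].

def set_skin_radius(skin_radii_backstop, radius, stride=3):
    """Overwrite the skin radius (component 0 of each stride) in place."""
    for i in range(0, len(skin_radii_backstop), stride):
        skin_radii_backstop[i] = radius
    return skin_radii_backstop

# Two constraints, each (skin radius, collision radius, backstop):
data = [0.0, 0.05, 0.0,   0.0, 0.05, 0.0]
print(set_skin_radius(data, 0.2))  # [0.2, 0.05, 0.0, 0.2, 0.05, 0.0]
```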
(19-01-2026, 02:26 AM)CptnFabulous Wrote: I looked at the example in that documentation (which is conveniently already for setting the skin constraints), and tried copying it and turning it into a standalone function with parameters instead of hardcoded values. But I got null errors when trying to retrieve the batches, so there didn’t seem to be values there to edit. I realised that the example code uses an ObiSkinnedCloth, whereas our current system just uses regular ObiCloth components, so I switched the cloth, renderer and mesh renderer components to the right ones. But now I'm getting an out of range error when trying to get the index offset and access the correct batches.
I’m sure I’m going about this completely the wrong way. Maybe I’m running this code at the wrong point during the generation, or I’m meant to modify values in the blueprint rather than the cloth itself. Any help would be greatly appreciated.
Thanks!
The main issue with skinned cloth is that it needs to be, well, skinned. In cases where you have lots of different clothing that must be worn by a single character/mannequin (or even worse, multiple characters of varying body shapes/sizes) the amount of artist work required skyrockets, as they must now skin every clothing piece to every possible body that may wear it. This typically makes it an impractical approach unless you can procedurally skin the cloth mesh to the character's skeleton upon deciding which cloth/body combination you want, which presents its own set of hurdles. If you've already figured out a solution to this or if it's not a problem in your case, forget what I just said.
It would help to take a look at your code. If you're not comfortable sharing it here, a PM or an email is fine; I'm sure it's a relatively quick fix.
cheers!
Posts: 13
Threads: 8
Joined: Feb 2024
Reputation:
0
(19-01-2026, 11:10 AM)josemendez Wrote: The main issue with skinned cloth is that it needs to be, well, skinned. In cases where you have lots of different clothing that must be worn by a single character/mannequin (or even worse, multiple characters of varying body shapes/sizes) the amount of artist work required skyrockets, as they must now skin every clothing piece to every possible body that may wear it. This typically makes it an impractical approach unless you can procedurally skin the cloth mesh to the character's skeleton upon deciding which cloth/body combination you want, which presents its own set of hurdles. If you've already figured out a solution to this or if it's not a problem in your case, forget what I just said.
Gotcha. I presume that the ObiSkinnedCloth is not generating the necessary batches, because there’s no skin data present on the mesh for it to work with?
In this case we’ll have to make code to procedurally skin the cloth meshes, as they’re entirely procedurally generated (edited by the user) at runtime. I imagine the simplest solution is to, for each vertex, calculate the bones within a certain distance (or the points in between the bones for things like limbs), and reference those bones’ indices in the added bone weights. Weightings could be calculated from either proximity or perhaps vertical position, as the higher fabric would potentially stretch to support the weight of the lower fabric.
Can this be done after simulation starts? The way our system works currently, the garments are made of several flat meshes that are wrapped around the body and stitched together through the simulation. If not, I’ll have to make code that modifies the meshes themselves to fit around the body. Fortunately I already spent some time working on this to try and solve a different problem, but I’ll still need to allocate time to finish it. I don't think it'll be practical to show our existing code, as the necessary functionality is spread out amongst a lot of different classes and it'd be hard to do so without violating NDA.
Cheers!
Posts: 6,716
Threads: 28
Joined: Jun 2017
Reputation:
435
Obi Owner:
23-01-2026, 09:11 AM
(This post was last modified: 23-01-2026, 10:26 AM by josemendez.)
(20-01-2026, 11:35 AM)CptnFabulous Wrote: Gotcha. I presume that the ObiSkinnedCloth is not generating the necessary batches, because there’s no skin data present on the mesh for it to work with?
In this case we’ll have to make code to procedurally skin the cloth meshes, as they’re entirely procedurally generated (edited by the user) at runtime. I imagine the simplest solution is to, for each vertex, calculate bones within a certain distance (or the points inbetween the bones for things like limbs), and reference those bones’ index in the added bone weights.
Yes, that's the basic way to do it: for each vertex, pick the N closest bones (typically N = 4) and assign each a weight based on some normalized function of distance (could be the distance itself, or a smooth falloff function that gives more relative importance to closer bones). To normalize the weights so that they add up to 1 and vertices don't move more than they should, calculate the sum of the falloffs for each vertex, then divide each weight by that sum. Something like:
Code: foreach vertex v:
    foreach bone b:
        vertexWeights[v].Add(FancyFalloffFunction(Distance(v, b)));

foreach vertex v:
    vertexWeights[v].SortDescending(); // highest weights (closest bones) first
    vertexWeights[v].Resize(N);        // keep only the N highest weights

foreach vertex v:
    sum = 0;
    foreach index i in vertexWeights[v]:
        sum += vertexWeights[v][i]; // accumulate sum of all weights for this vertex
    foreach index i in vertexWeights[v]:
        vertexWeights[v][i] /= sum; // divide by sum to make sure all weights add up to 1.
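As a runnable illustration of the pseudocode above (Python; the inverse-square falloff is picked arbitrarily as the "fancy falloff function", and a real implementation would also record which bone each kept weight belongs to, as done here via indices):

```python
# Per vertex: weight every bone by a smooth falloff of distance, keep the N
# largest weights (closest bones), then normalize so the kept weights sum to 1.
# Falloff 1/(d^2 + eps) is an arbitrary choice; eps avoids division by zero.

def skin_weights(vertices, bones, n=4, eps=1e-6):
    def falloff(v, b):
        d2 = sum((a - c) ** 2 for a, c in zip(v, b))
        return 1.0 / (d2 + eps)

    weights = []
    for v in vertices:
        # (weight, bone index) pairs, highest weight (closest bone) first
        scored = sorted(((falloff(v, b), i) for i, b in enumerate(bones)), reverse=True)
        kept = scored[:n]
        total = sum(w for w, _ in kept)
        weights.append([(w / total, i) for w, i in kept])  # normalize to sum 1
    return weights

# One vertex at the origin, three bones at distances 1, 2 and 3: with n=2,
# only the two closest bones (indices 0 and 1) receive normalized weights.
w = skin_weights([(0.0, 0.0, 0.0)],
                 [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0), (3.0, 0.0, 0.0)], n=2)
print(w[0])  # two (weight, bone index) pairs, weights summing to 1
```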
Caveats:
#1) - note that this may cause the cloth to clip through the character, since body vertex weights and cloth vertex weights won't necessarily match, so they'll deform differently. This isn't much of a problem if you have a full-body collider setup that can prevent the clipping.
It's worth noting that in the *ideal* case of having a skinned garment that doesn't clip through the body, skin constraints can prevent clipping without the need for colliders, since each cloth vertex is constrained to move "in front of" its skinned position: if the skinned cloth mesh doesn't clip the body, the simulation won't either.
#2) - using Euclidean distance between vertices and bones may not work well in some cases, for example pants: vertices belonging to one leg might actually be closer to the opposite leg's bones and become skinned to the wrong leg. In cases like these you'd want a more sophisticated solution, like voxel heat diffusion:
https://www.wolfire.com/blog/2009/11/vol...-skinning/
https://bronsonzgeb.com/index.php/2021/0...-skinning/
This measures distances between bones and vertices inside the volume of the body, preventing "gap closing" in areas like armpits and legs.
(20-01-2026, 11:35 AM)CptnFabulous Wrote:
Weightings could be calculated on either proximity, or perhaps vertical position, as the higher fabric would potentially stretch to support the weight of the lower one.
Don't make vertex/bone weights depend on vertical position. You can however make the skin constraints' "skin radius" depend on it, forcing cloth vertices near the shoulders to keep closer to their skinned position, and giving more freedom of movement to cloth vertices further down the body.
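A minimal sketch of that idea, with made-up heights and radii (the real values depend entirely on your character's proportions):

```python
# Derive each cloth particle's skin radius from its height: clamped lerp from a
# tight radius at the shoulders to a loose one lower down the body.
# y_top/y_bottom and the two radii are arbitrary example values.

def skin_radius_for_height(y, y_top=1.5, y_bottom=0.8, tight=0.01, loose=0.15):
    t = (y_top - y) / (y_top - y_bottom)  # 0 at shoulders, 1 at the hem
    t = max(0.0, min(1.0, t))             # clamp to [0, 1] outside that range
    return tight + t * (loose - tight)

print(skin_radius_for_height(1.5))  # tight radius: stays close to skinned pose
print(skin_radius_for_height(0.8))  # loose radius: free to swing at the hem
```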
(20-01-2026, 11:35 AM)CptnFabulous Wrote: Can this be done after simulation starts? The way our system works currently, the garments are made of several flat meshes that are wrapped around the body and stitched together through the simulation. If not, I’ll have to make code that modifies the meshes themselves to fit around the body. Fortunately I already spent some time working on this to try and solve a different problem, but I’ll still need to allocate time to finish it.
You can do this by allowing the simulation to run for a while against a t-posed mannequin, then baking the resulting cloth mesh(es) (look for the BakeMesh() method), then performing skinning and adjusting the skin constraints to use the newly generated skin data.
let me know if you need further help,
kind regards