(12-03-2025, 08:14 PM)Jawsarn Wrote: I'm not sure I follow here. We have the Solver on the root transform, and then one of it's children has a skinned mesh renderer which has a scale of 100, some rotation and offset.
Oh, sorry. So you want to scale a specific child transform of the character, not the entire character. What I suggested was to scale the solver, which will scale the character and all cloth actors inside of it.
(12-03-2025, 08:14 PM)Jawsarn Wrote: and things move around with inertia.
You can control how much inertia the solver's movement injects into the actors it manages by setting the
world linear/angular inertia scale parameters. This is useful for characters, since it allows you to explicitly control how much the cloth moves when the character moves. For instance, if cloth flails around too much when the character turns around 180º, reducing the angular inertia scale will result in less movement.
On the other hand, if you had a static solver sitting on the scene and the character moving
within it, you'd have no control over inertial forces.
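As a sketch of how you'd tweak these from a script (I'm assuming the `worldLinearInertiaScale`/`worldAngularInertiaScale` properties exposed by `ObiSolver` in recent Obi versions; double-check the names against your version's API, and note both are also adjustable in the solver's inspector):

```csharp
using UnityEngine;
using Obi;

// Attach to the same GameObject as the ObiSolver (e.g. the character root).
public class ClothInertiaTuning : MonoBehaviour
{
    void Start()
    {
        var solver = GetComponent<ObiSolver>();

        // 1 = actors receive the solver's full inertia, 0 = none at all.
        solver.worldLinearInertiaScale = 1.0f;

        // Lower the angular scale if cloth flails too much on fast turns.
        solver.worldAngularInertiaScale = 0.5f;
    }
}
```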
(12-03-2025, 08:14 PM)Jawsarn Wrote: The only way I've found to get correct default physics behavior right now is to scale up the mesh in the importer, and then inject a parent the hierarchy to the skinned mesh renderer with a scalar of 0.01.
This sounds about right: if your cloth object has a transform scale of 100, it means its mesh is 100 times smaller than it should be so it needs to be scaled x100 to be rendered at a normal size. Creating the blueprint from it will result in a tiny cloth since blueprints are generated
from a mesh alone, regardless of the transform hierarchy it will be used in.
By scaling the model up to a normal size at import time, you'll get normal-sized bones, a normal-sized mesh, hence a normal-sized cloth blueprint. But you'll still have an incorrect x100 scale at runtime due to its transform hierarchy, which you're "undoing" by adding a parent with a scale of 1/100: the children all have their scale multiplied by 0.01, so you end up with unit scale
and proper mesh sizes.
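To make the arithmetic explicit: the renderer's world scale ends up being 100 × 0.01 = 1. A minimal runtime sketch of the "compensating parent" trick (the child name is a placeholder, and in practice you'd likely set this up once in the editor rather than in code):

```csharp
using UnityEngine;

public class ScaleCompensation : MonoBehaviour
{
    void Awake()
    {
        // Placeholder: the child carrying the x100-scaled SkinnedMeshRenderer.
        Transform skinnedMesh = transform.Find("SkinnedMesh");

        // Inject a parent with scale 0.01 between the renderer and its
        // current parent, so the renderer's world scale becomes 100 * 0.01 = 1.
        var compensator = new GameObject("ScaleCompensator").transform;
        compensator.SetParent(skinnedMesh.parent, false);
        compensator.localScale = Vector3.one * 0.01f;
        skinnedMesh.SetParent(compensator, false);
    }
}
```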
You can't fix this by changing the scale value of the blueprint, since
bones in the hierarchy will still have incorrect scales: scaling the physical representation of the cloth but not the bones used to drive it will lead to weird results.
My two cents: a scale of *exactly* 100 makes me suspect this mesh has been incorrectly exported, possibly from Blender. Blender's FBX exporter default settings will often lead to transforms with a scale of 100 and a rotational offset. That may not be the case here, but if it is, the cleanest solution is probably to export it correctly. We also have a
video on this subject. Keeping scaling consistent is quite important in general, as it keeps you out of headache-inducing situations.
(12-03-2025, 08:14 PM)Jawsarn Wrote: but this solution will break once we want physics on another object to with a different scale.
You can have multiple blueprints with different scales, and then skin meshes that also have different scales to them,
as long as the cloth and the renderer components aren't on the same object. You can even have multiple meshes skinned to the same cloth. See:
https://obi.virtualmethodstudio.com/manu...modes.html
(12-03-2025, 08:14 PM)Jawsarn Wrote: A tangent question to what we want to achieve. Are skinned cloth more constrainted in movement compared to normal cloth?
They can be, and they usually are. Skinned cloth makes use of
skin constraints, which limit particle movement by keeping each particle in an area close to the mesh's skinned vertices. This allows you to blend simulation and animation, making sure the simulation doesn't stray too far from the reference animation. Most realtime character clothing engines work this way (Unity's built-in cloth does too). It's particularly useful for garments that are close to the body, such as shirts and pants: using the skinned/animated vertices to constrain the simulation is much cheaper and far more robust than relying on cloth collision detection against the character's body.
(12-03-2025, 08:14 PM)Jawsarn Wrote: Our case of a scarf has danling parts, and I'm trying to figure out which solution (skinned, cloth, rope, rod, softbody) etc. Tangent to that tangent is that we want to have a character tail later as well, and similarly there are rope, rod, bone, softbody to choose from as well. Is there some general matrix of pro cons for each solution?
Rope is a 1D chain of particles: best for things that are long and thin. Particles have no rotation (they're just points), so the rope can twist around its length freely.
Rods are also a 1D chain of particles, but they do consider orientation: You can control how they twist, and give them a specific rest orientation.
ObiBone is a particle tree, created from a bone hierarchy and driven by its animation: best for pre-rigged things like tails, braids, limbs, etc.
Cloth is a 2D net of particles: best for things that are mostly flat and have little or no volume.
Softbodies are a 3D network of particles: best for things that have some volume.
Which one to choose depends on the specifics of your use case. Take a scarf: it's a long and relatively thin piece of cloth, so you could be tempted to use a rod, rope or bone (a softbody wouldn't make much sense, since a scarf has basically zero volume). The problem is that a 1D chain of particles won't orient itself properly when colliding with a surface: it may roll, and the scarf may end up sideways.
You need at least two rows of particles for it to sit correctly on a surface, so the obvious choice is cloth. Here's a quick diagram I drew: the top half illustrates a rod/rope scarf colliding against a plane (it may rotate and sit incorrectly on the surface), the bottom half a cloth scarf:
Kind regards,