Help: Artifacts on simulated cloth but not original mesh (something to do with normals)
#1
Hi,

We have a system that procedurally generates cloth garments, then generates blueprints and simulates them on a virtual mannequin. On one of our garments we've noticed strange, dark artifacting on the bust and one of the thighs that isn't present on the regular mesh but appears once simulation starts. The artifacting shifts and changes when the garment is dragged (using ObiParticlePicker), which suggests it has something to do with the cloth particles' positions and normals. The dark spots must be the shader's lighting rendering those sections as if they were in shadow.

Pictures (I can't show too much because of NDAs)
The first two show the problem areas; the second two are the same areas with ObiSkinnedClothRenderer disabled (i.e. rendering the unsimulated mesh with a SkinnedMeshRenderer).


My theory was that the particles might be too constrained and forced to bunch up in unnatural ways, but I tried increasing the skin radius and that didn't help (it just made the rest of the garment fall away too much). I'm not sure what else could be causing this that's specific to the cloth simulation. If the normals weren't being calculated properly in the first place, I'd expect the artifacts to appear even without simulation. Any idea what could be doing this? I tried searching but wasn't sure how to articulate the question.

Additional details:
  • Obi Cloth 7.0.3, in Unity 2022.3.62f3
  • The material’s shader uses custom code to calculate the texture, but the lighting data is default for Unity’s URP so that shouldn't be causing any weird lighting behaviour. The artifacts also occur when using the completely standard URP Lit shader, and don't appear when rendering with a shader that doesn't have any lighting code.
  • I also made a slight modification to ObiSkinnedClothBlueprint.CommitBlueprintChanges(). Originally it ran the base function followed by an override that's basically identical, except for one different value (CreateDefaultSkinmap's 'mapBonesToParticles' bool being true instead of false), which caused severe lag spikes when generating complex garment shapes. I modified ObiClothBlueprint's base function to replace that boolean with a property that's overridden in ObiSkinnedClothBlueprint, and deleted the override. This seems to do the same work with considerably better performance, but could this change be affecting the normals in some way?
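For clarity, the refactor described above is the classic "replace a duplicated method with an overridable property" pattern. Here is a minimal, hypothetical sketch of that pattern (class and method names below are illustrative stand-ins, not Obi's actual API):

```python
class ClothBlueprintSketch:
    """Hypothetical base class standing in for ObiClothBlueprint."""

    # Subclasses override this property instead of duplicating the
    # whole commit method just to flip one flag.
    @property
    def map_bones_to_particles(self) -> bool:
        return False

    def commit_blueprint_changes(self) -> dict:
        # Single shared code path: the expensive skinmap-style work
        # runs exactly once, reading the overridable flag.
        return self.create_default_skinmap(self.map_bones_to_particles)

    def create_default_skinmap(self, map_bones_to_particles: bool) -> dict:
        # Placeholder for the real (expensive) skinmap creation.
        return {"mapBonesToParticles": map_bones_to_particles}


class SkinnedClothBlueprintSketch(ClothBlueprintSketch):
    """Hypothetical subclass standing in for ObiSkinnedClothBlueprint."""

    @property
    def map_bones_to_particles(self) -> bool:
        return True
```

Since only the flag changes, this refactor should not touch any vertex or normal data by itself, which matches the observation that it merely removed the duplicated work.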
Thanks!
Reply
#2
Hi!

By the looks of it, I'd say normals are the culprit too (especially since unlit shaders don't exhibit the issue). Would it be possible for you to bake the simulated mesh (right-click on ObiClothRenderer, "Bake Mesh" in the context menu - or the "Bake Mesh" button on the component itself if you're using an older version of the asset) and share both the original and the exported simulation meshes so that I can take a closer look at them?

During simulation, the mesh deforms. Both the deformed mesh vertex positions and normals are calculated by linear blend skinning the original mesh data (ClothRenderingJobs.cs, UpdateClothMeshJob). There are three things that can go wrong:
- Original mesh data is incorrect.
- Bind poses are incorrect.
- Current particle poses during simulation are incorrect. These are calculated in the RenderableOrientationFromNormals job at the end of each frame, by simply deriving a quaternion from each normal. Normals are a simple area-weighted cross product sum (UpdateTriangleNormalsJob). This assumes triangle winding order is consistent throughout the original mesh: if some triangles have opposite winding order, the normals will be flipped during simulation. This is consistent with what you're getting, so I'd look here first.
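The winding-order sensitivity mentioned in the last point can be sketched in a few lines. Below is a minimal, self-contained Python illustration of area-weighted normal accumulation (the same idea as UpdateTriangleNormalsJob, but not Obi's actual code): flipping a triangle's winding flips the sign of its contribution.

```python
def cross(a, b):
    """3D cross product of two tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def vertex_normals(vertices, triangles):
    """Area-weighted vertex normals: sum each face's unnormalized
    cross product into its three vertices. The cross product's
    magnitude is twice the triangle area, so larger triangles
    contribute more - hence 'area-weighted'."""
    normals = [[0.0, 0.0, 0.0] for _ in vertices]
    for i0, i1, i2 in triangles:
        v0, v1, v2 = vertices[i0], vertices[i1], vertices[i2]
        e1 = tuple(b - a for a, b in zip(v0, v1))
        e2 = tuple(b - a for a, b in zip(v0, v2))
        n = cross(e1, e2)  # direction depends on winding order!
        for idx in (i0, i1, i2):
            for k in range(3):
                normals[idx][k] += n[k]
    return normals

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
up   = vertex_normals(verts, [(0, 1, 2)])  # counter-clockwise: +z
down = vertex_normals(verts, [(0, 2, 1)])  # clockwise: -z (flipped)
```

A vertex shared by triangles with mixed winding receives partially cancelling contributions, which is why inconsistent winding shows up as dark, shifting patches once the simulation starts recomputing normals every frame.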

Kind regards,
Reply
#3
(04-05-2026, 09:39 AM)josemendez Wrote: Hi!

By the looks of it, I'd say normals are the culprit too (especially since unlit shaders don't exhibit the issue). Would it be possible for you to bake the simulated mesh (right-click on ObiClothRenderer, "Bake Mesh" in the context menu - or the "Bake Mesh" button on the component itself if you're using an older version of the asset) and share both the original and the exported simulation meshes so that I can take a closer look at them?

During simulation, the mesh deforms. Both the deformed mesh vertex positions and normals are calculated by linear blend skinning the original mesh data (ClothRenderingJobs.cs, UpdateClothMeshJob). There are three things that can go wrong:
- Original mesh data is incorrect.
- Bind poses are incorrect.
- Current particle poses during simulation are incorrect. These are calculated in the RenderableOrientationFromNormals job at the end of each frame, by simply deriving a quaternion from each normal. Normals are a simple area-weighted cross product sum (UpdateTriangleNormalsJob). This assumes triangle winding order is consistent throughout the original mesh: if some triangles have opposite winding order, the normals will be flipped during simulation. This is consistent with what you're getting, so I'd look here first.

Kind regards,

Whoops, forgot to update earlier! I ended up figuring it out myself: it turns out some of the original vertex positions were badly mangled, and the strict skin constraints on parts of the generated mesh meant those areas couldn't flatten out naturally. Since the meshes are procedurally generated, I made some tweaks to the generation code so that certain deformations are applied less aggressively.
Honestly there's not much interesting detail I can share without violating NDAs, but I appreciate the response anyway. The point about triangle winding order is interesting, but the normals in those regions looked totally fine before the simulation started.

Thanks!
Reply