Character + cloth
#1
So, if I want to simulate a softbody character with socks, for example, what is the best approach? Should I use a separate mesh with the same bones, or just one character mesh without the leg parts but with socks instead? Or can I somehow skin wrap (attach) the socks to the softbody character even without bones?

Sorry for so many questions!
But is there any way to forcefully update blend shapes for a softbody mesh? I don't care about performance, so I can wait for the update.
#2
(20-12-2021, 05:08 PM)aqualonix Wrote: So, if I want to simulate a softbody character with socks, for example, what is the best approach? Should I use a separate mesh with the same bones, or just one character mesh without the leg parts but with socks instead? Or can I somehow skin wrap (attach) the socks to the softbody character even without bones?

Mixing softbodies and cloth (as in a softbody simulation driving a cloth simulation) isn't trivial at all, but it can be done. Ideally, if you want the cloth simulation on top of the softbody simulation, you should skin the cloth (in this case, the socks) to the same set of bones used by the softbody, so: a separate mesh with the same bones. Then, at runtime, you will have to write some code that copies the extra bones generated by the ObiSoftbodySkinner to the cloth, and another script that updates the softbody and the cloth in that specific order: softbody first, cloth second.
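
The bone-copying part depends on Obi internals, but as a rough, Unity-only sketch of the underlying mechanism (retargeting the cloth renderer's bone transforms at runtime so both meshes are driven by the same skeleton; the component name and setup are hypothetical, and the extra skinner-generated bones would additionally need bind poses and bone weights on the cloth mesh):

Code:
// Hypothetical sketch: make the cloth's SkinnedMeshRenderer reuse the transforms
// of the softbody character's bones, matching them by name.
using System.Linq;
using UnityEngine;

public class ShareSkeletonWithSoftbody : MonoBehaviour
{
    public SkinnedMeshRenderer softbodyRenderer; // character body, driven by the ObiSoftbodySkinner
    public SkinnedMeshRenderer clothRenderer;    // the socks

    void Start()
    {
        // Remap each cloth bone to the transform with the same name in the softbody's
        // bone array, so both renderers follow the exact same transforms.
        var clothBones = clothRenderer.bones;
        for (int i = 0; i < clothBones.Length; ++i)
        {
            if (clothBones[i] == null) continue;
            var match = softbodyRenderer.bones.FirstOrDefault(b => b != null && b.name == clothBones[i].name);
            if (match != null)
                clothBones[i] = match;
        }
        clothRenderer.bones = clothBones;
    }
}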

(20-12-2021, 05:08 PM)aqualonix Wrote: But is there any way to forcefully update blend shapes for a softbody mesh? I don't care about performance, so I can wait for the update.

There will be no update to add support for this, since blend shapes and softbodies (or any deformable physics, really) basically work against each other: blend shapes change the rest state of an object, while deformable physics precomputes the rest state of an object to achieve realtime performance. So once you add blend shapes to cloth or a softbody, you need to recalculate the rest state every frame and forgo any realtime simulation.

Edit: I just realized that by "wait for the update" you didn't mean an asset update to support blend shapes, but that you don't care if the blend shape update is slow. Still, this isn't just a matter of performance: it isn't doable without basically rewriting the entire engine.

When you change a blend shape, the volume/shape of the mesh changes, and so do the number of particles in the softbody and their connectivity. So in addition to resampling the mesh with particles and regenerating all connections (which, in a nutshell, is the blueprint's "rest shape" information), you would have to transfer motion from the pre-blend-shape representation to the post-blend-shape one.


Anyway: if you're not doing this in realtime, why not use an offline simulator and import the results into Unity using Alembic? This is both much simpler and faster.
#3
Thank you for the answers! So, I painted the character in the ObiSoftbodySkinner to control which parts jiggle, and I have stockings that use the same bones as the character. I can use the ObiSoftbodySkinner to attach them to the character's softbody, but it's almost impossible to paint the stockings with the same values manually. Is there any method to copy the ObiSoftbodySkinner "map" to another object, so the stockings follow the character in the same jiggle zones?
#4
Hi!

You can write a script that copies data between constraints, based on particle distance or any other metric. The "scripting constraints" manual page explains how to deal with constraints:
http://obi.virtualmethodstudio.com/manua...aints.html

Each constraint batch has several arrays that contain per-constraint data. This data is stored as tuples: for instance, cloth skin constraints have a skinRadiiBackstop array which contains 3 values per constraint (radius, backstop sphere radius, and backstop distance). So to access the backstop sphere radius for constraint N, you do:

Code:
var value = solverSkinBatch.skinRadiiBackstop[N*3+1];

All constraint batches contain a "particleIndices" array that indicates which particles are part of each constraint. Again, the number of particles per constraint varies for each specific constraint type. Check the API docs for details:
http://obi.virtualmethodstudio.com/api.html
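
As a rough sketch of what that could look like (type and member names follow the scripting constraints page and may differ between Obi versions, so treat this as an unverified sketch):

Code:
// Sketch: iterate the solver's skin constraints and read the per-constraint
// (radius, backstop sphere radius, backstop distance) tuple described above.
using Obi;
using UnityEngine;

public class ReadSkinConstraints : MonoBehaviour
{
    public ObiSolver solver;

    void Start()
    {
        var skin = solver.GetConstraintsByType(Oni.ConstraintType.Skin)
                   as ObiConstraints<ObiSkinConstraintsBatch>;
        if (skin == null) return;

        foreach (var batch in skin.batches)
        {
            for (int i = 0; i < batch.activeConstraintCount; ++i)
            {
                int particle = batch.particleIndices[i];           // skin constraints reference one particle each
                float radius = batch.skinRadiiBackstop[i * 3];     // skin radius
                float sphere = batch.skinRadiiBackstop[i * 3 + 1]; // backstop sphere radius
                float dist   = batch.skinRadiiBackstop[i * 3 + 2]; // backstop distance
                Debug.Log($"constraint {i}: particle {particle}, radius {radius}, backstop ({sphere}, {dist})");
            }
        }
    }
}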

Let me know if I can be of further help!
#5
Well, I'm not using cloth for the stockings; they use a skinned mesh renderer. I just need the ObiSoftbodySkinner setup for the body and the stockings to be exactly the same, or for the stockings to follow the body with the same jiggle as the ObiSoftbody, but layered on top of it. Is that possible with scripting constraints?
It's like in this video, but for the whole body/object: https://youtu.be/n2OxesYuZtU
#6
Simply binding both meshes to the same softbody should give you the results you want. No need to write any code.

Keep in mind, though, that unless both meshes have identical topology, the resulting skin weights will not be identical. It’s just how linear blend skinning works: two vertices will not have the same skin weights unless they’re at the exact same position. Also, using the same skin weights for vertices at different positions will result in garbled deformation.
#7
Yes, they have the same topology and almost the same positions; the stockings can't be in exactly the same position as the body. I can bind them both to the same softbody, but I can't manually paint them the same way on the parts that need to jiggle. The painting method is very approximate.
#8
You can then copy weights between both. These are directly accessible in the ObiSoftbodySkinner component:

Code:
skinner.m_softbodyInfluences //<-- array of weights for each vertex. Values range from 0 to 1.

If you can't rely on the vertex ordering being the same for both meshes, you will have to use some other metric to determine corresponding vertices. Distance is a simple and good one: for each vertex in mesh A, look for the closest vertex in mesh B and copy its weight.
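
As a minimal sketch of that idea (assuming each skinner sits on the same GameObject as its SkinnedMeshRenderer and both objects are aligned in the scene; the component name is made up):

Code:
// Hypothetical one-off helper: copy softbody influence weights from the body's
// skinner to the stockings' skinner, matching vertices by closest distance.
using UnityEngine;
using Obi;

public class CopySoftbodyInfluences : MonoBehaviour
{
    public ObiSoftbodySkinner source; // body skinner, already painted
    public ObiSoftbodySkinner target; // stockings skinner

    [ContextMenu("Copy influences")]
    void Copy()
    {
        var sourceVerts = source.GetComponent<SkinnedMeshRenderer>().sharedMesh.vertices;
        var targetVerts = target.GetComponent<SkinnedMeshRenderer>().sharedMesh.vertices;

        for (int i = 0; i < targetVerts.Length; ++i)
        {
            // Compare positions in world space so both meshes share a common frame.
            Vector3 p = target.transform.TransformPoint(targetVerts[i]);

            // Brute-force closest-vertex search; fine for a one-off setup step.
            int closest = 0;
            float best = float.MaxValue;
            for (int j = 0; j < sourceVerts.Length; ++j)
            {
                float d = (p - source.transform.TransformPoint(sourceVerts[j])).sqrMagnitude;
                if (d < best) { best = d; closest = j; }
            }

            target.m_softbodyInfluences[i] = source.m_softbodyInfluences[closest];
        }
    }
}

If the vertex order really is identical in both meshes, you can skip the search entirely and copy the weights index by index.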

Kind regards,