21-12-2021, 08:21 AM
(This post was last modified: 21-12-2021, 12:06 PM by josemendez.)
(20-12-2021, 05:08 PM)aqualonix Wrote: So, if I want to simulate a softbody character with socks, for example, what is the best approach? Should I use a separate mesh with the same bones, or just one character mesh without the leg parts but with socks instead? Or maybe I can somehow skin wrap (attach) the socks to the softbody character even without bones?
Mixing softbodies and cloth (as in a softbody simulation driving a cloth simulation) isn't trivial at all, but it can be done. If you want the cloth simulation layered on top of the softbody simulation, ideally you should skin the cloth (in this case the socks) to the same set of bones used by the softbody: so yes, a separate mesh with the same bones. Then at runtime you'll have to write some code that copies the extra bones generated by the ObiSoftbodySkinner over to the cloth, and another script that updates the softbody and the cloth in that specific order: softbody first, cloth second.
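Here's a minimal sketch of the bone-sharing part, using only standard Unity API (the component and field names are illustrative, not part of Obi). It assumes both meshes were rigged against the same skeleton; in practice you'd also need the cloth mesh's bind poses to match the extra bones the skinner creates:

Code:
using UnityEngine;

// Illustrative only: shares the bone array driven by the softbody with the
// cloth's renderer, so the sock mesh follows the same deformation.
public class SoftbodyClothBoneSync : MonoBehaviour
{
    public SkinnedMeshRenderer softbodyRenderer; // mesh skinned by the ObiSoftbodySkinner
    public SkinnedMeshRenderer clothRenderer;    // sock mesh, rigged to the same skeleton

    void LateUpdate()
    {
        // Copy bone references every frame, after the softbody has updated.
        // Use script execution order (or a single updater) to guarantee the
        // softbody solver steps before the cloth solver does.
        clothRenderer.bones = softbodyRenderer.bones;
    }
}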
(20-12-2021, 05:08 PM)aqualonix Wrote: But is there any way to forcefully update blend shapes for a softbody mesh? I don't care about performance, so I can wait for the update.
There will be no update to add support for this, since blend shapes and softbodies (or any deformable physics, really) are fundamentally at odds: blend shapes change the rest state of an object, while deformable physics precomputes the rest state of an object to achieve realtime performance. So once you add blend shapes to cloth or a softbody, you need to recalculate the rest state every frame and forgo any realtime simulation.
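To make the first point concrete, here's a plain-Unity snippet (standard SkinnedMeshRenderer API, nothing Obi-specific) showing that a blend shape literally changes the shape of the mesh, i.e. the very thing a blueprint precomputes:

Code:
using UnityEngine;

// Changing a blend shape weight morphs the mesh itself: the "rest" shape a
// blueprint would have precomputed no longer matches what is being rendered.
public class BlendShapeRestStateDemo : MonoBehaviour
{
    void Start()
    {
        var smr = GetComponent<SkinnedMeshRenderer>();
        smr.SetBlendShapeWeight(0, 100f); // fully apply the first blend shape

        var baked = new Mesh();
        smr.BakeMesh(baked); // baked.vertices now differ from smr.sharedMesh.vertices
    }
}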
Edit: I just realized that by "wait for the update" you didn't mean an asset update to support blend shapes, but that you don't care if updating the blend shapes is slow. This isn't just a performance issue: it isn't doable without basically rewriting the entire engine.
When you change a blend shape, the volume/shape of the mesh changes, and so do the number of particles in the softbody and their connectivity. So in addition to resampling the mesh with particles and regenerating all connections (which, in a nutshell, is the blueprint's "rest shape" information), you would have to transfer motion from the pre-blend-shape representation to the post-blend-shape one.
Anyway: if you're not doing this in realtime, why not use an offline simulator and import the results into Unity using Alembic? That's both much simpler and faster.
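For reference, once the baked .abc file is imported with Unity's official Alembic package (com.unity.formats.alembic), runtime playback is just a matter of driving the stream player's time. A minimal sketch:

Code:
using UnityEngine;
using UnityEngine.Formats.Alembic.Importer; // Unity's Alembic package

// Minimal sketch: loop a baked simulation by advancing the Alembic stream's
// time each frame. Assumes an AlembicStreamPlayer on the same GameObject.
[RequireComponent(typeof(AlembicStreamPlayer))]
public class BakedSimPlayback : MonoBehaviour
{
    AlembicStreamPlayer player;

    void Awake()
    {
        player = GetComponent<AlembicStreamPlayer>();
    }

    void Update()
    {
        // Advance and wrap around at the end of the cached animation.
        player.CurrentTime = Mathf.Repeat(player.CurrentTime + Time.deltaTime, player.Duration);
    }
}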