Cutting softbodies at runtime
#1
I'm quite aware that Obi doesn't really support mesh topological changes out of the box, but I'm thinking of using this six-step process:

1. Disable the Obi Updater.
2. Slice the mesh using another tool.
3. Check each cut piece's size; if it's smaller than a certain amount, discard it to preserve performance.
4. Add Obi softbody components to each new mesh object.
5. Generate runtime blueprints for the bigger pieces and skin them.
6. Re-enable the Obi Updater.
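Roughly what I have in mind, as a placeholder sketch (MeshSlicer, PieceSize, minPieceSize and GenerateAndSkin are all hypothetical names, not real code):

```csharp
// Placeholder sketch of the pipeline; the slicing tool, size check and
// blueprint generation step are all stand-ins for things I'd still build.
IEnumerator CutSoftbody(ObiFixedUpdater updater, GameObject target)
{
    updater.enabled = false;                          // 1. pause simulation

    foreach (var piece in MeshSlicer.Slice(target))   // 2. slice with another tool
    {
        if (PieceSize(piece) < minPieceSize)          // 3. discard tiny pieces
        {
            Destroy(piece);
            continue;
        }

        var body = piece.AddComponent<ObiSoftbody>(); // 4. add softbody components
        yield return GenerateAndSkin(body);           // 5. runtime blueprint + skinning
    }

    updater.enabled = true;                           // 6. resume simulation
}
```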

Would this theoretically work? And if so, how would I go about generating runtime blueprints (since I haven't found any documentation on this and I'm pretty sure runtime blueprint generation isn't supported due to its performance implications) as well as the other stuff?

Thanks a lot in advance,
woffles

:)
#2
(23-08-2022, 12:15 AM)woffles Wrote: Would this theoretically work?

Hi!

It would work, but would also be extremely slow. See below.

(23-08-2022, 12:15 AM)woffles Wrote: And if so, how would I go about generating runtime blueprints (since I haven't found any documentation on this and I'm pretty sure runtime blueprint generation isn't supported due to its performance implications) as well as the other stuff?

Blueprints can be generated at runtime just fine, but doing this is not a realtime operation. Generating a blueprint is very costly, so it's exposed as a coroutine: you're supposed to start the coroutine, forget about it, and it will be finished at some point later in time (usually several seconds later). Showing the user a progress bar is a good idea, which is exactly what the editor does when you generate a blueprint in-editor. For this purpose, the coroutine returns an object which contains a human-readable message and a completion percentage value.

Check the manual for details ("Scripting actors" --> "Creating blueprints" section); generating a blueprint is as simple as calling its Generate() coroutine:
http://obi.virtualmethodstudio.com/manua...ctors.html
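A minimal runtime-generation sketch following that pattern. Treat the blueprint type, `inputMesh` and `softbodyBlueprint` names as assumptions to be checked against your Obi version and the manual page above:

```csharp
using System.Collections;
using UnityEngine;
using Obi;

public class RuntimeBlueprint : MonoBehaviour
{
    public ObiSoftbody softbody;

    IEnumerator Start()
    {
        // Create an empty blueprint asset at runtime
        // (type name assumed; check your Obi version).
        var blueprint = ScriptableObject.CreateInstance<ObiSoftbodySurfaceBlueprint>();
        blueprint.inputMesh = GetComponent<MeshFilter>().sharedMesh;

        // Generate() is a coroutine: it can take several seconds to finish,
        // so yield on it instead of blocking the frame. The objects it yields
        // carry a progress percentage and message you can show in a UI.
        yield return StartCoroutine(blueprint.Generate());

        // Assign the finished blueprint to the actor (property name assumed).
        softbody.softbodyBlueprint = blueprint;
    }
}
```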

kind regards,
#3
hey, thanks for the quick reply!

I should've clarified a little more I guess. I'm working with a flat '2D' mesh (currently experimenting with Unity's built-in quads), and eventually I'm going to turn those squares into rectangles or other square-like objects.

My current in-editor blueprint settings for the quad are voxels with a generated resolution of 6. Considering I will only be generating one layer of particles for all meshes after slicing, will this dramatically lighten the load and improve performance, or would this have next to no effect? I'm guessing that I can also cover it up with some graphics shenanigans.

I did find a topic from two years ago that touched on this, but I'm not really sure if newer Obi versions have a more optimal way of dealing with this.
--> http://obi.virtualmethodstudio.com/forum...p?tid=2269

On the topic of 2D and 3D, I've found that somewhat accurately simulating around 28 (at most) 3D softbodies on my 3090 with a voxel count of 14 is about the limit before my framerate crash-lands to about 2 fps. Would any performance optimizations be possible that could perhaps break the 30+ barrier? (It was actually due to this performance limitation that I switched to a mostly 2D focus.)

thank you :)

regards,
-woffles
#4
(23-08-2022, 08:13 AM)woffles Wrote: hey, thanks for the quick reply!

I should've clarified a little more I guess. I'm working with a flat '2D' mesh (currently experimenting with Unity's built-in quads), and eventually I'm going to turn those squares into rectangles or other square-like objects.

My current in-editor blueprint settings for the quad are voxels with a generated resolution of 6. Considering I will only be generating one layer of particles for all meshes after slicing, will this dramatically lighten the load and improve performance, or would this have next to no effect? I'm guessing that I can also cover it up with some graphics shenanigans.

The costliest part of blueprint generation is graph coloring. The cost of coloring is quadratic in the number of constraints in the softbody (100 operations for 10 constraints, 10,000 operations for 100 constraints, 1 million operations for 1,000 constraints, etc.), so as the softbody gets larger the cost rises quite quickly. The topology of the softbody (its shape, and whether it's 2D or 3D) does affect the number of constraints, and hence the blueprint generation cost.

You can test this by generating your softbody blueprint in-editor: if generation is almost instant there, it will be at runtime too (the process is exactly the same); if it takes a while in-editor, it will also take a while at runtime.

(23-08-2022, 08:13 AM)woffles Wrote: I did find a topic from two years ago that touched on this, but I'm not really sure if newer Obi versions have a more optimal way of dealing with this.
--> http://obi.virtualmethodstudio.com/forum...p?tid=2269

Pretty much everything said on that thread applies to the latest version as well.

(23-08-2022, 08:13 AM)woffles Wrote: On the topic of 2D and 3D, I've found that somewhat accurately simulating around 28 (at most) 3D softbodies on my 3090 with a voxel count of 14 is about the limit before my framerate crash-lands to about 2 fps. Would any performance optimizations be possible that could perhaps break the 30+ barrier? (It was actually due to this performance limitation that I switched to a mostly 2D focus.)

Regarding performance, the total amount of particles in the solver (your spatial discretization resolution) is only half of the equation. The other half is your temporal discretization resolution, i.e. your timestep: using fewer substeps or a larger timestep size will also improve performance by updating the simulation less often, at the cost of making your softbodies softer. The manual explains the internal workings of the engine as well as the impact of timestep size on the results: http://obi.virtualmethodstudio.com/manua...gence.html
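For instance, the substep count can be lowered from script via the updater (the `substeps` field name is taken from recent Obi versions; double-check against yours):

```csharp
using UnityEngine;
using Obi;

public class PerformanceTuning : MonoBehaviour
{
    public ObiFixedUpdater updater;

    void Start()
    {
        // Fewer substeps = less work per frame, but softer, less accurate bodies.
        updater.substeps = 2;

        // Alternatively, enlarge the timestep itself (note this affects
        // all physics in the scene, not just Obi):
        // Time.fixedDeltaTime = 0.03f;
    }
}
```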

kind regards,
#5
hello again!

Thanks for all the information! It looks like I accidentally skipped over the runtime blueprint generation documentation while I was searching through the docs ._.

in regards to raycasting, the documentation mentions using a solver to do so. Would the resulting raycast(s) come from the solver, or could I specify a certain place for a ray to start and stop? If it does come from the solver, how would I set up a script to have multiple raycasts all checking for softbodies in different places?

thanks so much!

regards,
-woffles
#6
(23-08-2022, 12:19 PM)woffles Wrote: in regards to raycasting, the documentation mentions using a solver to do so. Would the resulting raycast(s) come from the solver, or could I specify a certain place for a ray to start and stop?

What do you mean exactly? The solver defines a group of actors and the space they're simulated in, raycasts are a point + a direction. A ray cannot "come from a solver". You must define a ray using a point and a direction, just like you do in Unity.

Think of solvers as "worlds": when you do solver.Raycast(), you're raycasting over all actors in that solver/world.

(23-08-2022, 12:19 PM)woffles Wrote: If it does come from the solver, how would I set up a script to have multiple raycasts all checking for softbodies in different places?

See the multi raycast example in the manual:
http://obi.virtualmethodstudio.com/manua...eries.html

That's a script that sets up and launches multiple raycasts in parallel over all actors in a solver, and returns a list of hits.
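In rough outline it looks like this; the exact Raycast overload and result type here are assumptions, so use the manual page above as the authoritative reference:

```csharp
using System.Collections.Generic;
using UnityEngine;
using Obi;

public class MultiRaycast : MonoBehaviour
{
    public ObiSolver solver;

    void Update()
    {
        // Each ray is an origin + direction, exactly like Unity's Ray struct.
        var rays = new List<Ray>
        {
            new Ray(new Vector3(0, 5, 0), Vector3.down),
            new Ray(new Vector3(2, 5, 0), Vector3.down)
        };

        // Batched raycast over all actors in this solver
        // (method signature assumed; see the spatial queries manual page).
        var results = solver.Raycast(rays, maxDistance: 10);

        // ...iterate results to find which softbodies/particles were hit...
    }
}
```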

kind regards,
#7
ah, I think I understand the system now. I thought the ray would come from the solver itself, but I see now that they work basically the same way as Unity's built-in raycasts.

thank you!