All the assets share the same core physics solver and the same philosophy: everything (cloth, fluids, ropes...) is made out of particles, small lumps of matter that relate to each other through the use of constraints. However, games rarely need a fully unified physics engine. So instead of offering a single, very expensive asset that can simulate all of the above, we split Obi into multiple smaller assets. This way you only pay for what you need, and you can still use all assets together if you own them all.
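To illustrate the particle/constraint idea, here is a minimal, language-agnostic sketch (written in Python for brevity) of a position-based distance constraint between two particles. This is not Obi's actual API; all names and signatures here are hypothetical, and a real solver would batch many constraints and iterate over them each substep.

```python
import math

def project_distance_constraint(p1, p2, w1, w2, rest_length):
    """Move two particles so their distance approaches rest_length.
    p1, p2 are 3D positions; w1, w2 are inverse masses (0 = pinned)."""
    dx = [b - a for a, b in zip(p1, p2)]           # vector from p1 to p2
    d = math.sqrt(sum(c * c for c in dx))          # current distance
    if d == 0 or (w1 + w2) == 0:
        return p1, p2                              # degenerate or fully pinned
    # Scalar correction, distributed according to inverse masses:
    s = (d - rest_length) / (d * (w1 + w2))
    p1 = [a + w1 * s * c for a, c in zip(p1, dx)]
    p2 = [b - w2 * s * c for b, c in zip(p2, dx)]
    return p1, p2

# Two equal-mass particles 2 units apart, constrained to rest at 1 unit:
a, b = [0.0, 0.0, 0.0], [2.0, 0.0, 0.0]
a, b = project_distance_constraint(a, b, 1.0, 1.0, 1.0)
```

A rope is then just a chain of such constraints between consecutive particles; cloth adds bending and area constraints, and fluids replace distance constraints with density constraints, all solved by the same iterative projection loop.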
Simply restart Unity. Sometimes, Unity fails to load new native libraries upon importing them. The only way to force a reload is to restart Unity.
It depends on what your goals are. If you want to make something more complex than a simple flag or a water faucet, most likely the answer is no. Obi is a powerful, complex system. If it makes sense for a particular parameter to be exposed to the user, Obi does expose it. No compromises were made in its design to trade flexibility for ease of use. A lot of effort was put into making it 'as simple as possible, but not simpler'.
Throughout the manual, API documentation and engine design, basic 3D math/simulation concepts (like vector spaces, inertial forces, mass or fixed timestepping) aren't explained in detail. It's a good idea to familiarize yourself with these before attempting to use Obi in a real-world project.
The good news is that if you devote enough time to it, it will pay off. It will also help you grasp many essential simulation concepts that transfer readily to other physics simulators. You'll wonder why those other simulators impose so many artificial limits on what you can do with them.
No. The core physics solver runs 100% on the CPU. Only rendering is done on the GPU.
Yes. It is extremely well optimized. A task-based multithreading scheme is used to split workload into evenly sized chunks. All the math-heavy routines are SIMD accelerated.
It depends on the simulation backend used (see backends). If using Burst, you can build for all platforms supported by Unity's Burst compiler, including consoles. If using Oni, you can build for Windows, Linux, macOS, iOS, and Android.
Cloth, Rope and Softbodies support VR. For Obi Fluid, you will have to use separate cameras for each eye since the renderer does not support single-pass stereo rendering.
Cloth, Rope and Softbodies support all SRPs out of the box, since they do not perform any custom rendering. Obi Fluid does include a custom rendering pipeline based on screen-space ellipsoid splatting, that currently supports the built-in pipeline and the URP (no HDRP support yet).
No, it currently doesn't, as this feature is deemed experimental as of Unity 2020.2. Disabling domain reloading requires special handling of static data, which is not yet implemented in Obi.
The full C# source code for the Burst backend is included. The source code for the Oni backend is not. Oni is written in highly portable C++14. We do license it on a per-case basis, so contact us if you're interested. Sources for everything else (C# MonoBehaviours, classes, shaders, etc.) are included.
The rope inspector is a scene-view window, and Unity won't draw scene-view windows while any inspector is in debug mode. Close all inspectors in debug mode (the easiest, quickest way is to revert the window layout to the default) and it will become visible again.
Most likely the cause is that your colliders and your Obi actors (cloth, rope, etc.) are using the same phase. Obi requires actors and colliders to be in separate phases in order to generate contacts, as this allows for more fine-grained collision control. Also, keep in mind that PolygonCollider2D is not currently supported. See the collisions manual page for detailed info.