(30-06-2021, 12:56 PM)dfierro Wrote: Hello Lidia,
Thanks a lot for sharing the video, it's a very creative way of using Obi.
I have a problem and I would like to ask for some advice, as you might have encountered it too. I'm using the Unity Recorder to capture some simulations in 360º, but I'm having problems with the fluids (cloth seems to work fine). If I set everything up as advised in the YouTube videos I can see the beautiful result on the Unity screen, which is the image of the only camera in the scene, but when I try to record it in 360º the video doesn't show the fluid. I tried activating the "render fluid" option on the emitter and the particles do appear in the video, but it's not beautiful, as you see the particles and not the final result. I wanted to ask you if you had any trouble recording it, or if there is some configuration I could be missing. Thanks a lot!
(If I take a normal "2d" video the fluids are shown)
Hi there!
Fluid is rendered using a technique known as screen-space ellipsoid splatting. You can find a simpler version of the technique described here:
https://developer.download.nvidia.com/pr...ffects.pdf
The manual also describes the process, see the "how it works" section of the "Fluid Rendering" page:
http://obi.virtualmethodstudio.com/manua...ering.html
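To give an intuition for what screen-space splatting does, here's a toy sketch (not Obi's actual code, which runs on the GPU and splats anisotropic ellipsoids rather than spheres): each particle is projected to the screen and rasterized as a sphere impostor into a depth buffer, keeping the nearest surface per pixel. The resulting depth buffer is what later gets smoothed and shaded to look like a continuous fluid surface. The pinhole projection and all parameter names here are illustrative assumptions.

```python
import numpy as np

def splat_depth(particles, radius, width, height, focal):
    """Toy CPU version of sphere splatting into a depth buffer.

    particles: iterable of eye-space (x, y, z) positions, z < 0 (in front
    of the camera). 'focal' is a pinhole focal length in pixels.
    """
    depth = np.full((height, width), np.inf)
    for px, py, pz in particles:
        # Perspective-project the particle centre to pixel coordinates.
        sx = int(width / 2 + focal * px / -pz)
        sy = int(height / 2 + focal * py / -pz)
        # Projected radius in pixels shrinks with distance.
        r = max(1, int(focal * radius / -pz))
        for y in range(max(0, sy - r), min(height, sy + r + 1)):
            for x in range(max(0, sx - r), min(width, sx + r + 1)):
                dx, dy = x - sx, y - sy
                if dx * dx + dy * dy <= r * r:
                    # Approximate depth of the sphere's front face here.
                    z = -pz - radius * np.sqrt(1.0 - (dx * dx + dy * dy) / (r * r))
                    depth[y, x] = min(depth[y, x], z)
    return depth
```

Note how the projection (the `focal * px / -pz` lines) is baked directly into the splatting step: this is exactly why the renderer needs to know which projection the camera uses.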
In order to do this, the renderer has to know which projection the camera is using. Obi supports only perspective and orthographic projections, which are the only ones supported by Unity cameras out of the box, and the ones used in pretty much all games.
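The two supported projections recover eye-space depth from a depth-buffer value in very different ways: perspective depth is non-linear (an inversion is needed), while orthographic depth is a simple linear remap. A minimal sketch, assuming OpenGL-style NDC depth in [-1, 1] (Unity's actual conventions vary per platform and may use reversed-Z, so treat this as illustration only):

```python
def eye_depth_perspective(z_ndc, near, far):
    # Invert the non-linear perspective depth mapping:
    # z_ndc = (far+near)/(far-near) - 2*far*near/((far-near)*d)
    return 2.0 * far * near / ((far + near) - z_ndc * (far - near))

def eye_depth_orthographic(z_ndc, near, far):
    # Orthographic depth is linear: remap [-1, 1] to [near, far].
    return near + (z_ndc + 1.0) * 0.5 * (far - near)
```

A 360º capture uses neither of these mappings, which is why code written against them produces no usable fluid surface there.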
The Unity Recorder package hijacks camera rendering and uses a custom projection matrix (probably spherical/cubemap/dual paraboloid, etc) to render a 360º view. Since Obi does not support projection types other than perspective or orthographic, fluid isn't rendered in 360º.
As of now there are no plans to support custom projections in the fluid renderer, so this is something that can't be worked around without writing some shader code. If you have some programmer(s) on your team, getting this to work would involve: taking a look at Unity Recorder's source code, working out what type of projection it uses for 360º, then implementing that exact same projection in Obi's fluid renderer (ObiFluids.cginc is the main file you're interested in: it contains the methods that convert a Z-buffer value to eye-space depth and reconstruct eye-space position from depth, all of which depend on the kind of projection used).
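For reference, the "reconstruct eye-space position from depth" step mentioned above boils down to unprojecting a point through the inverse of the camera's projection matrix. Here's a hedged sketch using an OpenGL-style perspective matrix (the function names and conventions are mine, not Obi's; Unity's own matrices differ by platform):

```python
import numpy as np

def perspective(fov_y, aspect, near, far):
    # Standard OpenGL-style perspective projection matrix.
    f = 1.0 / np.tan(fov_y / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = -(far + near) / (far - near)
    m[2, 3] = -2.0 * far * near / (far - near)
    m[3, 2] = -1.0
    return m

def eye_pos_from_ndc(ndc, proj):
    # Unproject an NDC point back to eye space via the inverse projection,
    # then divide by w to undo the perspective divide.
    p = np.linalg.inv(proj) @ np.append(ndc, 1.0)
    return p[:3] / p[3]
```

Swapping in a 360º projection means replacing both the matrix construction and this unprojection with the recorder's actual mapping (e.g. per-cubemap-face matrices), which is the work the paragraph above describes.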
Cloth/ropes/softbodies do not perform any rendering of their own; they simply output a mesh. So 360º projections, and any other custom rendering setup, will just work with them.