Help: Using Fluid in VR (HTC Vive)
#11
(07-02-2022, 01:09 PM)josemendez Wrote: URP VR is not supported as of now, since Unity does not support using renderer features on a per-camera basis which would be required for this to work.
Ah. That's... disappointing.
EDIT: Is there any ETA/hope for this to change?
#12
(07-02-2022, 02:15 PM)Epineurien Wrote: Ah. That's... disappointing.

I know, sorry. There's no full feature parity between URP and built-in yet. For instance, URP's renderer features can't filter rendering to specific objects in the scene, since renderer features are assets and assets cannot reference scene objects. This is also why URP doesn't allow you to render multiple fluids at once, each with its own rendering parameters: all fluid particles are piped through the same renderer.
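
To illustrate the limitation, here's a hypothetical, stripped-down renderer feature (made-up names, not Obi's actual code):

Code:
using UnityEngine;
using UnityEngine.Rendering.Universal;

// Renderer features are serialized inside the ScriptableRendererData *asset*.
public class FluidRendererFeature : ScriptableRendererFeature
{
    // Assets can't serialize references to scene objects, so this field can
    // never point at a specific camera or at fluid renderers placed in a scene.
    public Camera targetCamera;

    public override void Create() { }

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        // Without per-camera filtering, whatever passes are enqueued here run
        // for every camera that uses this renderer asset.
    }
}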

Quote:Is there any ETA/hope for this to change?

It's entirely up to Unity. When/if we get the required support from renderer features (being able to apply them to specific cameras and access scene info), this won't be a problem.

If you want a refund for this reason, write to support(at)virtualmethodstudio.com.
#13
(07-02-2022, 01:48 PM)locque Wrote: Here are some pictures showing what happens when you switch the target from the main display to the left eye:

https://imgur.com/a/gV2cVfy

The shadow is still being rendered for the left eye, but the fluid itself is missing.

That's quite weird. I'll try to reproduce it and get back to you.
#14
Can now confirm this is not an issue with 2019.4 and Legacy OpenVR. Anything that can be done to get this working in 2020.3?

You don't actually need two cameras, it works fine with a single one targeting both eyes for me. Lighting looks wrong though, no matter whether you use two cameras or one. The reflections of a light source are way too far apart on each eye:

https://imgur.com/a/xrCeK79
#15
(08-02-2022, 09:06 AM)locque Wrote: Can now confirm this is not an issue with 2019.4 and Legacy OpenVR. Anything that can be done to get this working in 2020.3?

You don't actually need two cameras, it works fine with a single one targeting both eyes for me. Lighting looks wrong though, no matter whether you use two cameras or one. The reflections of a light source are way too far apart on each eye:

https://imgur.com/a/xrCeK79

Hi there,

Could reproduce the issue with fluid not rendering on separate cameras in 2020.3 and above. Not sure what has changed on Unity's side for this to stop working; I'll investigate and get back to you.

All fluid lighting is calculated using the eye-space position of the fluid. Since the fluid's surface is not a mesh but an implicit function, positions are reconstructed from the depth buffer and the camera's frustum data (which is passed to the fluid shaders from the CPU). For this reason, fluid rendering can't possibly work using single-pass or multi-pass stereo: the frustum data passed to the shader would be wrong, and so would all lighting calculations. You do need two cameras for it to work correctly.
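
Roughly, the per-camera data upload works something like this (a simplified sketch with made-up property names, not the actual Obi scripts or shaders):

Code:
using UnityEngine;

// Passes this camera's frustum corners to a fluid material so the shader can
// reconstruct eye-space positions from the depth buffer.
[RequireComponent(typeof(Camera))]
public class FrustumCornersUploader : MonoBehaviour
{
    public Material fluidMaterial; // material exposing a hypothetical _FrustumCorners matrix
    readonly Vector3[] corners = new Vector3[4];

    void OnPreRender()
    {
        var cam = GetComponent<Camera>();

        // Far-plane frustum corners in eye space, for *this* camera only.
        cam.CalculateFrustumCorners(new Rect(0, 0, 1, 1), cam.farClipPlane,
                                    Camera.MonoOrStereoscopicEye.Mono, corners);

        var m = Matrix4x4.identity;
        for (int i = 0; i < 4; ++i)
            m.SetRow(i, corners[i]);

        // With single-pass stereo there is only one camera, so both eyes would
        // receive identical frustum data and lighting would break for one eye.
        fluidMaterial.SetMatrix("_FrustumCorners", m);
    }
}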

I couldn't reproduce the lighting issues you describe when using two cameras, though. I'm on Unity 2019.4.30f1, using Oculus Quest.
#16
(08-02-2022, 01:23 PM)josemendez Wrote: Hi there,

Could reproduce the issue with fluid not rendering on separate cameras in 2020.3 and above. Not sure what has changed on Unity's side for this to stop working; I'll investigate and get back to you.

All fluid lighting is calculated using the eye-space position of the fluid. Since the fluid's surface is not a mesh but an implicit function, positions are reconstructed from the depth buffer and the camera's frustum data (which is passed to the fluid shaders from the CPU). For this reason, fluid rendering can't possibly work using single-pass or multi-pass stereo: the frustum data passed to the shader would be wrong, and so would all lighting calculations. You do need two cameras for it to work correctly.

I couldn't reproduce the lighting issues you describe when using two cameras, though. I'm on Unity 2019.4.30f1, using Oculus Quest.

Are you using Legacy VR or XR Plugin Management? The Fluid Renderer doesn't work with XR Plugin Management for me, not even in 2019.4. With Legacy VR I can get the fluid rendered, but the lighting is definitely broken, just as badly with a two-camera setup as with a single one.

I also noticed that using two cameras more than halves my framerate and shows a massive CPU load compared to single-camera stereo rendering. Is the physics simulation also being done twice with two cameras? That doesn't make sense to me, but what other explanation could there be?

I can't even get a stable 45 fps for 90 Hz with reprojection in the faucet sample scene with only 1200 particles. I can't throw any faster hardware at it either; I'm already using a 5950X and a 6900XT.
#17
(09-02-2022, 11:09 AM)locque Wrote: Are you using Legacy VR or XR Plugin Management? The Fluid Renderer doesn't work with XR Plugin Management for me, not even in 2019.4. With Legacy VR I can get the fluid rendered, but the lighting is definitely broken, just as badly with a two-camera setup as with a single one.

I'm using the XR plugin in 2019.4, oddly enough. It also works OK in Legacy.

It works and the lighting is fine for me. Are you using any other asset or special setup that might interfere with lighting?

(09-02-2022, 11:09 AM)locque Wrote: I also noticed that using two cameras more than halves my framerate and shows a massive CPU load compared to single-camera stereo rendering. Is the physics simulation also being done twice with two cameras? That doesn't make sense to me, but what other explanation could there be?

Simulation is only performed once, no matter how many times you render it. However, rendering is not free at all for the CPU: you're doubling the amount of draw calls, pipeline state changes, culling work, etc. That's why single-pass stereo was conceived: to ease the burden on the CPU. The GPU always has to render the same thing twice, no matter whether you're using single-pass, multi-pass, or two cameras. This thread is related: https://forum.unity.com/threads/vr-singl...ge.425280/

Then there's death spiraling: if your frame rate is low, physics will have to be updated more than once per frame (remember that FixedUpdate can be called 0, 1, 2, or as many times per frame as needed to keep a constant timestep). If rendering drags performance down enough, physics will be updated multiple times per frame and drag it down even more. Check the number of FixedUpdate() calls in your profiler; if there's more than one per frame, you need to reduce Unity's maximum allowed timestep setting (found in the Time manager).
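
A quick way to check this (a minimal sketch; the max timestep can also be set in Project Settings > Time):

Code:
using UnityEngine;

// Counts how many FixedUpdate steps run per rendered frame. More than one on
// average means physics is being stepped several times per frame.
public class FixedUpdateCounter : MonoBehaviour
{
    int stepsThisFrame;

    void Start()
    {
        // Caps how much time a single frame is allowed to simulate, which
        // limits the number of physics steps that can pile up per frame.
        Time.maximumDeltaTime = 0.05f;
    }

    void FixedUpdate() => stepsThisFrame++;

    void Update()
    {
        // All FixedUpdate steps for a frame run before Update, so this is
        // the count for the frame currently being rendered.
        if (stepsThisFrame > 1)
            Debug.Log($"FixedUpdate ran {stepsThisFrame} times this frame.");
        stepsThisFrame = 0;
    }
}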

(09-02-2022, 11:09 AM)locque Wrote: I can't even get a stable 45 fps for 90 Hz with reprojection in the faucet sample scene with only 1200 particles. I can't throw any faster hardware at it either; I'm already using a 5950X and a 6900XT.

45 fps is way too low; I'd recommend using the profiler to see what the culprit is. It might be simulation, but most likely it's rendering. You can share a profiler screenshot if you want help interpreting its data.

It would help to know your fluid renderer settings too. Some have a noticeable impact on performance (downscaling, for example).
#18
Quote:It works and the lighting is fine for me. Are you using any other asset or special setup that might interfere with lighting?

I'm only using the SteamVR plugin, nothing else.
#19
I have been trying to get the lighting working in VR for more than a week on my own now, no success.

Could you please take a look at this test project to see if I did something wrong? It's got nothing other than the Obi dependencies and the bare minimum of packages to get PC VR working in Unity 2019.4:

https://we.tl/t-xtbIxx5U83

There's an XR Rig with two cameras and fluid renderers in the faucet sample scene, and the OpenVR settings are set to multi-pass. The lighting is still completely off, though.

Do I also have to set the tracked pose driver to the individual eyes instead of the center eye point for each camera? If I do that, the eyes become completely misaligned and you get a headache as soon as you put on the headset.
#20
If you don't want to look at my test project, could you please provide me with a project that you set up properly for VR instead?

I just want to know whether I'm actually doing something wrong and can't figure out what, or whether there simply is no way to fix the lighting.