Suggestion / Idea: OBI7 Surface Texture Rendering (Lava) discussion
#1
Hi, I'm unable to edit the Foam shader, but I have a suggestion; try it if you're interested.

Using an alpha mask plus the particle's lifetime (0-1) as a threshold will produce a dissolve effect.
If you can change the foam shader to do this, we can have 5x-10x bigger foam, more coverage, and a dissolve effect over time with a lower foam count.
I've packed a texture that has a parabolic falloff for the main alpha (R), a circular foam alpha (G), and a directional foam alpha (B) as the dissolve step.
The texture is made in Substance Designer; feel free to pack it into the final Obi package if you want.
https://drive.google.com/file/d/1aLCCs3I...sp=sharing

If you can pass the foam movement direction through the UVs, you can use the B channel; otherwise use the G channel. A rough sketch of the idea is below.
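Here's a rough HLSL sketch of what I mean (the names are just illustrative, since I can't see your actual foam shader; the idea is to erode the card's alpha against the packed mask as the particle ages):

Code:
// Illustrative sketch only, not Obi's actual foam shader.
// _MaskTex packs: R = main alpha falloff, G = circular foam alpha, B = directional foam alpha.
// life01 is the particle's normalized lifetime (0 = just born, 1 = about to die).
sampler2D _MaskTex;

float4 FoamDissolveFrag(float2 uv, float life01, float4 tint)
{
    float3 mask = tex2D(_MaskTex, uv).rgb;

    // Base shape of the foam card.
    float alpha = mask.r * tint.a;

    // Lifetime as a dissolve threshold: as life01 goes from 0 to 1,
    // more of the card is eroded away.
    float dissolve = step(life01, mask.g); // use mask.b instead if UVs follow the movement direction
    // Softer edge: float dissolve = smoothstep(life01 - 0.05, life01 + 0.05, mask.g);

    return float4(tint.rgb, alpha * dissolve);
}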

#2
My texture may not be enough to do this, but essentially this is the effect I'm aiming for.
This can't be done with the current small foam particles, since it would need an absurd number of them.
They're very big foam cards with a dissolve-over-time technique.
Next on my wishlist: a Foam shader graph and custom nodes.

[Image: 62dfb1e19938c.gif]

I wonder why you don't use VFX Graph for foam? Is this about compatibility with low-end devices?
Unity's VFX Graph has been able to read compute buffers since 2021 LTS, so I think it's very well suited for Obi 7.
#3
(08-05-2024, 04:47 AM)spikebor Wrote: My texture may not be enough to do this, but essentially this is the effect I'm aiming for.
This can't be done with the current small foam particles, since it would need an absurd number of them.
They're very big foam cards with a dissolve-over-time technique.
[Image: 62dfb1e19938c.gif]

What you see in your gif is not foam cards/particles (or at least, not just particles); that's crossfaded texture coordinate advection.

This is a technique that advects two sets of texture coordinates using the fluid's velocity, crossfading between them and periodically resetting them to avoid too much stretching. We have this in FluXY; maybe this is what you meant in your other post about advecting mesh vertices?

For heightmap-based fluids this is very easy to do, since the UV coordinates can simply be top-down projected. For arbitrary 3D meshes, however, one must devise a way to procedurally generate base texture coordinates for the mesh; triplanar mapping would be the basic way to approach this.

Obi does not support this yet, but likely will in the future: all that's needed to do this in a shader is for the fluid mesh to contain velocity information in its vertices, which is very simple to add (at the cost of increased surface chunk memory usage).
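Once that velocity is available, the shader side would look roughly like this. This is a minimal sketch with illustrative names and parameters, not the final implementation; baseUV would be a top-down projection for a heightmap, or triplanar coordinates for an arbitrary mesh:

Code:
// Minimal sketch of crossfaded texture coordinate advection ("dual rest field").
// Assumes the fluid mesh provides per-vertex velocity, which Obi doesn't expose yet.
sampler2D _FoamTex;
float _Period;      // seconds between resets of each UV set
float _AdvectScale; // how far UVs are dragged per unit of velocity

float2 AdvectedUV(float2 baseUV, float2 velocity, float phase)
{
    // phase is a sawtooth in [0,1): UVs drift for _Period seconds, then snap back.
    return baseUV - velocity * _AdvectScale * phase * _Period;
}

float4 AdvectedFoam(float2 baseUV, float2 velocity, float time)
{
    float phaseA = frac(time / _Period);
    float phaseB = frac(time / _Period + 0.5); // second UV set, half a period out of phase

    float4 texA = tex2D(_FoamTex, AdvectedUV(baseUV, velocity, phaseA));
    float4 texB = tex2D(_FoamTex, AdvectedUV(baseUV, velocity, phaseB));

    // Triangle-wave weight: each set fades out right before it resets,
    // so the reset itself is never visible.
    float w = abs(phaseA * 2.0 - 1.0);
    return lerp(texA, texB, w);
}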

Compared to particle-based foam this is quite limited, though, since it confines foam to the surface: foam can't be underwater and can't splash out of the water. The good news is that both methods can be combined.

(08-05-2024, 04:47 AM)spikebor Wrote: I wonder why you don't use VFX Graph for foam? Is this about compatibility with low-end devices?
Unity's VFX Graph has been able to read compute buffers since 2021 LTS, so I think it's very well suited for Obi 7.

Tried using it, but the workflow for reading/writing data to/from compute shaders is honestly not very good. Coupled with the fact that we also needed CPU simulation support for Burst, and precise control over how velocity-based billboard stretching is done (in VFX Graph it results in weird distortion when the velocity vector faces the camera's forward vector, which breaks the illusion of foam splashing around), we decided it would be easier and faster to roll our own GPU/CPU particle system and include it with the asset.
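For illustration, here's a generic sketch of the kind of control I mean: stretching is done along the screen-space projection of the velocity, so particles moving straight towards the camera stay round instead of distorting. This is illustrative only, not Obi's actual foam renderer, and it assumes the standard UNITY_MATRIX_V / UNITY_MATRIX_I_V macros:

Code:
// Generic sketch of velocity-based billboard stretching in view space.
// Illustrative only; not Obi's actual foam shader.
float3 StretchedCorner(float3 centerWS, float3 velocityWS, float2 corner, float size, float stretch)
{
    float3 centerVS = mul(UNITY_MATRIX_V, float4(centerWS, 1.0)).xyz;
    float3 velVS    = mul((float3x3)UNITY_MATRIX_V, velocityWS);

    // Keep only the on-screen component of the velocity: when a particle moves
    // straight towards the camera this shrinks to zero and the billboard stays round.
    float2 dir = velVS.xy;
    float  len = length(dir);
    dir = len > 1e-4 ? dir / len : float2(0.0, 1.0);

    // Stretch along the screen-space velocity, keep the perpendicular width constant.
    float2 axisU  = dir;
    float2 axisV  = float2(-dir.y, dir.x);
    float2 offset = axisU * corner.x * size * (1.0 + stretch * len)
                  + axisV * corner.y * size;

    return mul(UNITY_MATRIX_I_V, float4(centerVS + float3(offset, 0.0), 1.0)).xyz;
}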
#4
(07-05-2024, 03:40 PM)spikebor Wrote: Next on my wishlist: a Foam shader graph and custom nodes.

Unfortunately, the foam shader can't be done with ShaderGraph (I wish it could!) because ShaderGraph does not currently support channel masking: rendering only to a specific channel of a render target, like you can with ShaderLab's ColorMask instruction. This is required to blend foam with transparent fluids, since we must modify only the fluid's alpha channel, leaving its RGB color intact.
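For reference, this is the kind of thing that currently has to be handwritten. A minimal illustrative ShaderLab example (not Obi's actual foam pass), where ColorMask restricts writes to the target's alpha channel:

Code:
// Minimal illustrative example, not Obi's actual foam shader: ColorMask A limits
// writes to the alpha channel, so foam only modifies the fluid's alpha.
Shader "Hidden/Example/AlphaOnlyFoam"
{
    Properties { _MainTex ("Foam Texture", 2D) = "white" {} }
    SubShader
    {
        Tags { "Queue" = "Transparent" }
        Pass
        {
            Blend One One   // additively accumulate coverage (blend mode is illustrative)
            ZWrite Off
            ColorMask A     // write only the alpha channel of the render target

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f     { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert(appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv  = v.uv;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // RGB is discarded by ColorMask A; only the alpha is written.
                return fixed4(0, 0, 0, tex2D(_MainTex, i.uv).a);
            }
            ENDCG
        }
    }
}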

It also doesn't support blend modes, outputting fragment depth, indirect instancing, and lots of other stuff that would let us use ShaderGraph for pretty much every shader in Obi. What's baffling to me is that most of these are trivial to support and very commonly used, so I'm not sure why Unity hasn't added them yet.

So for the time being, some shaders must be handwritten, since making them with ShaderGraph is impossible :(
#5
(08-05-2024, 07:43 AM)josemendez Wrote: What you see in your gif is not foam cards/particles (or at least, not just particles); that's crossfaded texture coordinate advection.

This is a technique that advects two sets of texture coordinates using the fluid's velocity, crossfading between them and periodically resetting them to avoid too much stretching. We have this in FluXY; maybe this is what you meant in your other post about advecting mesh vertices?

For heightmap-based fluids this is very easy to do, since the UV coordinates can simply be top-down projected. For arbitrary 3D meshes, however, one must devise a way to procedurally generate base texture coordinates for the mesh; triplanar mapping would be the basic way to approach this.

Obi does not support this yet, but likely will in the future: all that's needed to do this in a shader is for the fluid mesh to contain velocity information in its vertices, which is very simple to add (at the cost of increased surface chunk memory usage).

Compared to particle-based foam this is quite limited, though, since it confines foam to the surface: foam can't be underwater and can't splash out of the water. The good news is that both methods can be combined.


Tried using it, but the workflow for reading/writing data to/from compute shaders is honestly not very good. Coupled with the fact that we also needed CPU simulation support for Burst, and precise control over how velocity-based billboard stretching is done (in VFX Graph it results in weird distortion when the velocity vector faces the camera's forward vector, which breaks the illusion of foam splashing around), we decided it would be easier and faster to roll our own GPU/CPU particle system and include it with the asset.

Hi, that's not what you think. The technique you used, the same one as in that Catlike Coding blog post, results in an unpleasant periodic resetting effect over the foam's lifetime, while the gif I showed you doesn't have that.
Those are actually splash cards, the same as advected particles but bigger, and their orientation follows the movement.
You can give this blog a read:
Working with Niagara Fluids to Create Water Simulations (80.lv)

Halfway through the blog, they reveal the splash card representation:
Water surface + caustics
[Image: 62dfae9267d17.gif]
Water surface + caustics + splash cards
[Image: 62dfaea46cf4f.gif]
Splash cards visualized:
[Image: 62dfaecca9831.gif]

If you can do this, the number of advected particles can be reduced by a lot, since their size can be very big, like in that third gif.
#6
(08-05-2024, 08:18 AM)spikebor Wrote: Hi, that's not what you think. The technique you used, the same one as in that Catlike Coding blog post, results in an unpleasant repeated fade-in/fade-out effect over the foam's lifetime, while the gif I showed you doesn't have that.

It is texture coordinate advection; in the article you shared, he goes on to explain it right after the cards part (he calls it a "dual rest field"; "dual" because there are two sets of coordinates). Here are the advected texture coordinates:

[Image: 62dfb02554b69.gif]

Here's a video of it using cat textures, where you can clearly see the repetition. He just uses a very long period to keep it from being noticeable:

https://twitter.com/Vuthric/status/15129...imulations

So it's actually both techniques: particles for the "hazy" foam (which is barely noticeable in your original gif), and texture advection for the solid-looking foam on top of the fluid mesh.
#7
(08-05-2024, 08:25 AM)josemendez Wrote: It is texture coordinate advection; in the article you shared, he goes on to explain it right after the cards part (he calls it a "dual rest field"; "dual" because there are two sets of coordinates). Here are the advected texture coordinates:

[Image: 62dfb02554b69.gif]

Here's a video of it using cat textures, where you can clearly see the repetition. He just uses a very long period to keep it from being noticeable:

https://twitter.com/Vuthric/status/15129...imulations

So it's actually both techniques: particles for the "hazy" foam (which is barely noticeable in your original gif), and texture advection for the solid-looking foam on top of the fluid mesh.

Yes! Can ObiFluid have this effect? That foam looks very good and very big.
The smaller advected particles could then be used as bubbles.
#8
(08-05-2024, 08:30 AM)spikebor Wrote: Yes! Can ObiFluid have this effect? That foam looks very good and very big.
The smaller advected particles could then be used as bubbles.

Quoting myself here ;):

Quote:Obi does not support this yet, but likely will in the future: all that's needed to do this in a shader is for the fluid mesh to contain velocity information in its vertices, which is very simple to add (at the cost of increased surface chunk memory usage).

Once the fluid mesh has velocity information, this is doable. We'll also throw in a ShaderGraph node that does the actual crossfading of texture coordinates, so you can implement this easily with minimal complexity.
#9
Thanks bro, I'll go into my time capsule and fast-forward to the future :D
#10
(08-05-2024, 08:35 AM)spikebor Wrote: Thanks bro, I'll go into my time capsule and fast-forward to the future :D

Done! I just need to clean it up and expose parameters in the UI. This will also allow textured opaque fluids, which is interesting for lava/mud and similar stuff. It will be included in the RC around next Monday, so maybe you won't need a time capsule ;)
