Suggestion / Idea: Stretching scale along each axis
#1
Bombilla 
Hi, by the way, great asset.
It would be cool if we could stretch along each axis separately, relative to the proxy configuration position.

[Image: B2MiiOTsDGogN-fJle0IgvRSVlC4ZDeJWqY3x8Fg...authuser=1]

For instance, I'm using the same blueprint to populate clothes in the scene, and I'm using the stretching scale to make some size variations.
If it were possible to stretch along the X axis only, for example, I could make something like a long pillow, instead of every variation having the same proportions.

It's just an idea; if it's not too hard to implement, it would be cool.
Reply
#2
Hi,

The link you posted does not seem to contain an image, so it isn't visible. :(

Deformable objects can't be non-uniformly scaled at runtime. In short, the reason is that non-uniform scaling does not preserve Euclidean distances. I'll copy-paste a more detailed explanation I gave in a different thread:

Quote: There are several kinds of transforms: affine, rigid, perspective, etc. The transforms used in Unity (and most 3D packages) are affine transforms. This means they have translation, rotation, and scale (and shear/skew). If they had only translation and rotation, they'd be rigid transforms.

Now, here's the caveat: rigid transforms preserve Euclidean distance, affine transforms do not. This means that if you have a line and you rigidly transform it (rotate or translate it), its length remains the same. But if you scale it (applying an affine transform), its length changes. If scaling is uniform, you can get the new length by multiplying the unscaled length by the scale factor. But if scaling is non-uniform, there's no quick way to recover the new length from the old one: you'd have to do the full math of subtracting both ends of the line, dotting the resulting vector with itself, and taking the square root to recalculate the post-scale length.

What does this have to do with cloth and deformable objects in general? Well, part of the information stored by cloth to remember its rest shape is the length of every edge in the mesh. This is so the cloth can stretch/compress/bend, yet still keep its original shape and return to it. So if you scale cloth, its rest shape remains unscaled; otherwise the rest shape would need to be recalculated every frame, which is very costly. Scale is applied to the object after deformation, which can result in weird spiky meshes and all sorts of visual glitches.

For this reason, when you scale a piece of cloth (or any deformable object) it won't grow/shrink the way a rigid object would. Only rigid transforms can be applied to deformable objects; affine transforms cannot. So if your cloth transform in the scene has a scale of (200, 200, 200), you need to use a scale of (1/200, 1/200, 1/200) when generating the cloth blueprint, so that the two compensate each other once the affine transform is applied. This can also lead to floating-point precision issues, since the simulation would be performed using very tiny numbers (the larger your scene scale, the smaller the blueprint values: 1/200 is smaller than 1/100). This is cumbersome and not immediately obvious to most people. It's also not exclusive to Obi; all deformable physics engines I've ever used or heard of precalculate and store the rest shape of objects.
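To make the distance argument above concrete, here's a minimal C# sketch (plain Unity vector math, no Obi API involved) that measures an edge before and after scaling, and also computes the inverse-scale compensation just mentioned:

Code:
using UnityEngine;

public static class ScaleLengthDemo
{
    // Length of the edge between two points: subtract the endpoints,
    // dot the result with itself, take the square root.
    static float EdgeLength(Vector3 a, Vector3 b)
    {
        Vector3 d = b - a;
        return Mathf.Sqrt(Vector3.Dot(d, d)); // same as d.magnitude
    }

    public static void Demo()
    {
        Vector3 a = Vector3.zero;
        Vector3 b = new Vector3(1, 1, 0);
        float rest = EdgeLength(a, b); // sqrt(2) ≈ 1.414

        // Uniform scale: the new length is simply rest * s.
        float uniformLength = rest * 2f; // ≈ 2.828, no re-measuring needed

        // Non-uniform scale: no single factor recovers the new length;
        // the edge has to be re-measured from the scaled endpoints.
        Vector3 s = new Vector3(2f, 1f, 1f);
        float nonUniformLength = EdgeLength(Vector3.Scale(a, s), Vector3.Scale(b, s)); // sqrt(5) ≈ 2.236

        // Compensating a scaled scene transform: if the transform is
        // (200, 200, 200), generate the blueprint at the inverse scale
        // so the two cancel out.
        Vector3 t = new Vector3(200, 200, 200);
        Vector3 blueprintScale = new Vector3(1f / t.x, 1f / t.y, 1f / t.z); // (0.005, 0.005, 0.005)

        Debug.Log($"rest: {rest}, uniform: {uniformLength}, non-uniform: {nonUniformLength}, blueprint scale: {blueprintScale}");
    }
}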

The easiest workaround for this is to generate several blueprints of your object at different scales, then pick between them at random at runtime.
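For instance, a minimal spawner sketch, assuming each scale variant has already been baked into its own prefab (the component and field names here are just placeholders, not part of Obi):

Code:
using UnityEngine;

// Hypothetical spawner: each entry in 'variants' is a prefab whose cloth
// was baked from a blueprint generated at a different scale.
public class ClothVariantSpawner : MonoBehaviour
{
    public GameObject[] variants; // assign the pre-baked prefabs in the inspector

    public GameObject Spawn(Vector3 position, Quaternion rotation)
    {
        // Pick one of the pre-generated size variants at random.
        int i = Random.Range(0, variants.Length); // int overload: max is exclusive
        return Instantiate(variants[i], position, rotation);
    }
}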
Reply
#3
(28-02-2021, 07:16 PM)josemendez Wrote: The easiest workaround for this is to generate several blueprints of your object at different scales, then pick between them at random at runtime.

I hope this one is publicly visible ↓

[Image: Screenshot_2.png]
I started with about 10 different blueprints and noticed they consume a lot of processing, leaving my game at about 10 fps, even with optimized iterations, Unity timesteps, and so on. I used those blueprints to make prefabs and populated the scene with about 50 clothing objects, but it was choking at 10 fps. Oh, by the way, I'm using skinned maps, since the cloth slave meshes are thick.

I had to cut some of the different meshes. After a lot of tests, I found that using 1 blueprint for 5 different slaves and populating the scene with about 45 cloths can reach ~58 fps. Artistically it's not my ideal, but performance vs. art is a really, really tough fight.

So, to get more variation in this situation, I'm setting the stretch scale between 0.85 and 1.5 on some prefabs to get size variations (sketched below). It produced some physics artifacts, but by playing with the constraints I got some acceptable results.

[Image: unknown.png]
If per-axis stretching were possible, it would be cool to stretch the pillow more along X than along Y/Z, making some proportion variations, you know?
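Roughly what I'm doing with the stretch scale, as a sketch. It assumes ObiCloth exposes the "Stretching Scale" setting as a stretchingScale property; verify the exact name in your Obi version, since it may differ:

Code:
using UnityEngine;
using Obi; // assumes the Obi asset is installed

// Randomizes the stretch scale of a cloth instance at startup to get
// size variation out of a single blueprint.
[RequireComponent(typeof(ObiCloth))]
public class RandomStretchScale : MonoBehaviour
{
    public float min = 0.85f;
    public float max = 1.5f;

    void Start()
    {
        // 'stretchingScale' is assumed to back the "Stretching Scale"
        // field in the distance constraints section; the property name
        // is an assumption, not confirmed against every Obi version.
        GetComponent<ObiCloth>().stretchingScale = Random.Range(min, max);
    }
}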

Oh, I also figured out that the more vertices/particles the driven mesh has, the more expensive it is to process. In hindsight it should have been obvious, but my artist side made me build the driven mesh at 3-4x the resolution it needed; now I'm using 9 vertices. Some penetrations occur, but not that many; between performance and art, I had to leave it like this.

[Image: unknown.png]

And by the way, I also have 2 curtains and 1 blanket in the scene, which are very wide meshes, but I also lowered their driven mesh resolution, with some acceptable results.
Reply