27-12-2018, 09:36 AM
(27-12-2018, 01:30 AM)ObiWan6565 Wrote: Hi Jose,
Thank you for the quick response. I tried switching to local-space simulation, but with FixedUpdate running at 30 fps the program just crashes, so maybe that's too many simulation steps per second for ObiSolver to handle? Below is a more detailed description of what I'm trying to achieve:
In my AR game, I've managed to attach a virtual ball and chain to a person's hand by using the RGB feed of a simple webcam. The chain rendering mode works perfectly for this and I'm really happy with the results. However when the person walks back and forth from the webcam, I want the chain to scale accordingly:
Rigidbodies scale perfectly, but when I scale the rope object the links either separate from each other (when the scale is decreased) or move closer together (when it is increased). Is there a simpler way to achieve this procedurally at runtime, without changing the length or the number of particles in the rope?
Thanks,
IMHO, if you're going for a perspective effect, you should use a perspective camera that matches the webcam's parameters (FOV). That way you'd get correct scaling automatically, with no need to fake it.
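As a rough sketch of that idea: Unity's `Camera.fieldOfView` is the vertical FOV in degrees, and it can be derived from the webcam's focal length and sensor height. The intrinsics below are placeholder values; use your webcam's datasheet or a calibration tool to get the real ones:

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class MatchWebcamFov : MonoBehaviour
{
    // Hypothetical intrinsics -- replace with your webcam's calibrated values.
    public float focalLengthMm = 4.0f;
    public float sensorHeightMm = 2.7f;

    void Start()
    {
        // Vertical FOV in degrees: 2 * atan(sensorHeight / (2 * focalLength)).
        float fovDeg = 2f * Mathf.Atan(sensorHeightMm / (2f * focalLengthMm)) * Mathf.Rad2Deg;

        Camera cam = GetComponent<Camera>();
        cam.orthographic = false;   // perspective projection
        cam.fieldOfView = fovDeg;   // Unity's fieldOfView is the vertical FOV
    }
}
```

With the virtual camera's projection matching the physical one, the chain will shrink and grow on screen automatically as the hand moves toward or away from the webcam.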
If you still want to go the scaling route, the only way to correctly scale a simulation as a whole is to use local space mode. You cannot scale a rope in world space mode, as that would only scale the fixed particles. Make sure you've set it up correctly: the rope must sit under the solver's transform.
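For reference, a minimal sketch of the runtime scaling described above. It assumes the rope actor is parented under the ObiSolver's transform and the solver is set to simulate in local space (the exact inspector setting name depends on your Obi version); the distance-to-scale mapping and the `referenceDistance` value are illustrative, not part of Obi's API:

```csharp
using UnityEngine;

public class ScaleChainWithDistance : MonoBehaviour
{
    public Transform solverRoot;            // the ObiSolver's transform; the rope must be its child
    public float referenceDistance = 1.0f;  // hand-to-camera distance at which scale == 1 (assumed)

    // Call this each frame with the hand distance estimated from the RGB feed.
    public void SetHandDistance(float distance)
    {
        // Closer hand -> larger apparent chain; scale the whole simulation uniformly
        // by scaling the solver's transform, which local-space mode respects.
        float s = referenceDistance / Mathf.Max(distance, 0.01f);
        solverRoot.localScale = Vector3.one * s;
    }
}
```

Scaling the solver transform uniformly keeps the relative particle spacing intact, which is why the links no longer separate or bunch up the way they do when only the rope object is scaled in world-space mode.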