Take a 30 kg sphere resting on a plane; I want to make the ball jump. Gravity is set to -10 in Unity (m/s²) and -1000 in Unreal (cm/s²).
In Unity, I use this line to make my ball jump: AddForce(Vector3.up * 9 * 30, ForceMode.Impulse). The ball rises 4 meters when jumping.
In Unreal, I use this instead: AddImpulse(FVector::UpVector * 900 * 30). The ball rises 18 cm when jumping.
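As a sanity check (a minimal sketch in plain Python, not engine code): with ForceMode.Impulse, AddForce applies a velocity change of impulse / mass, and AddImpulse does the same in Unreal, so if the Unreal body really weighed 30 kg the two calls should produce the same jump, just expressed in different units:

```python
def jump_height(impulse, mass, g):
    """Height reached after an upward impulse: v = J / m, then h = v^2 / (2g)."""
    v = impulse / mass
    return v * v / (2 * g)

# Unity: SI units, gravity magnitude 10 m/s^2
h_unity = jump_height(impulse=9 * 30, mass=30, g=10)        # 4.05 m

# Unreal: centimeters, gravity magnitude 1000 cm/s^2
h_unreal = jump_height(impulse=900 * 30, mass=30, g=1000)   # 405 cm, i.e. the same 4.05 m

print(h_unity, h_unreal)
```

Both work out to the same height, which is why the 18 cm result points at something other than the unit conversion.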
I'm currently porting a project from Unity to Unreal, and I would like to know why there is such a big difference, even though the Unreal values have been multiplied by 100 since Unreal works in centimeters.
According to my tests, I need to call AddImpulse(FVector::UpVector * 4200 * 30) to achieve a similar result, so roughly 4.7 times the impulse (4200 / 900 ≈ 4.7).
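That multiplier can be back-derived from the observed numbers (a rough diagnostic sketch, assuming the simple impulse model above; the figures are computed from the 18 cm rise, not taken from the engine):

```python
import math

g = 1000.0            # Unreal gravity magnitude, cm/s^2
impulse = 900 * 30    # kg*cm/s, the impulse from the original call
h_observed = 18.0     # cm, the measured rise

# A rise of h implies a launch speed of v = sqrt(2*g*h)
v = math.sqrt(2 * g * h_observed)   # ~190 cm/s

# Since impulse = m * v, the mass the engine actually used is:
m_effective = impulse / v           # ~142 kg, not the expected 30 kg

print(f"launch speed:   {v:.1f} cm/s")
print(f"effective mass: {m_effective:.1f} kg")
print(f"mass ratio:     {m_effective / 30:.2f}")  # ~4.7, matching the 4200/900 multiplier
```

So the physics body behaves as if it weighed about 142 kg instead of 30 kg, which is exactly the ~4.7x factor the trial-and-error impulse compensates for.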
I finally found the issue. In Unreal, the total mass of an actor is the sum of the masses of its components.
Because my ball had both a collider and a mesh, the sphere collider's mass was set to 30 kg, while the mesh's mass was auto-calculated because I hadn't overridden it. By setting the mass of the mesh to 0, I now get similar behavior in both engines.
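This explanation reproduces both observations with the same back-of-the-envelope formula (plain Python, not engine code; the 112 kg auto-calculated mesh mass is a hypothetical value back-derived from the 18 cm rise):

```python
def jump_height(impulse, mass, g=1000.0):
    """Height in cm after an upward impulse, with g in cm/s^2."""
    v = impulse / mass
    return v * v / (2 * g)

impulse = 900 * 30        # the original Unreal impulse
mass_collider = 30.0      # the mass set on the sphere collider
mass_mesh_auto = 112.0    # hypothetical auto-calculated mesh mass

# Before the fix: Unreal sums the component masses
h_before = jump_height(impulse, mass_collider + mass_mesh_auto)  # ~18 cm

# After the fix: mesh mass zeroed, only the 30 kg collider remains
h_after = jump_height(impulse, mass_collider)                    # 405 cm, matching Unity's ~4 m

print(h_before, h_after)
```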