On developer.apple.com, the two frameworks are described as follows:
ARKit: Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game.
and
RealityKit: Simulate and render 3D content for use in your augmented reality apps.
As I understand it, ARKit is the AR SDK that handles object detection and tracking, while RealityKit is the rendering module.
In my project, I imported only the RealityKit framework into Xcode to build the AR application. In addition, I used Reality Composer to create an experience anchored to an image. The resulting AR application works correctly.
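For reference, here is a minimal sketch of that setup, assuming a Reality Composer project named "Experience" containing an image-anchored scene called "ImageScene" (both names are placeholders; Xcode generates a `load<SceneName>()` method per scene):

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Xcode generates Experience.loadImageScene() from the .rcproject file.
        // RealityKit configures and runs the underlying AR session automatically.
        let imageSceneAnchor = try! Experience.loadImageScene()
        arView.scene.anchors.append(imageSceneAnchor)
    }
}
```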
However, I have 2 questions:
RealityKit's ARSession is based on ARKit's ARSession, but it has its own add-ons that let you automate the tracking of several anchor types. The difference between the two is that RealityKit also provides its own rendering, physics, and animation engines. And do not forget that ARKit also works with SceneKit and SpriteKit as renderers, through ARSCNView and ARSKView.
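A minimal sketch of those two levels (asset group and image names are placeholders): a RealityKit AnchorEntity that automates image tracking, and direct access to the same ARView's underlying ARKit ARSession with a manual configuration.

```swift
import ARKit
import RealityKit

// RealityKit add-on: AnchorEntity automates image tracking, and RealityKit
// configures the underlying ARKit session for you.
func addAutomaticImageAnchor(to arView: ARView) {
    let imageAnchor = AnchorEntity(.image(group: "AR Resources", name: "poster"))
    imageAnchor.addChild(ModelEntity(mesh: .generateBox(size: 0.05)))
    arView.scene.anchors.append(imageAnchor)
}

// ARKit level: arView.session is a plain ARSession, so you can still run
// your own configuration if you need full control over tracking.
func runManualConfiguration(on arView: ARView) {
    let configuration = ARImageTrackingConfiguration()
    if let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) {
        configuration.trackingImages = referenceImages
    }
    arView.session.run(configuration)
}
```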