I'm trying to create an ADF pipeline that copies data from Azure Data Explorer (Kusto) to Dataverse. The source table has about 0.9 million records.
When I run the pipeline, it fails.
How can I copy all 900k records to the sink?
How much data can be loaded in a single pipeline run?
Kusto client failed reading the result stream from the service: 'Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host'
The error above can have the following causes.
It can be caused by a firewall blocking the connection or by a temporary network issue. Publish your resources and try again. Also go to the Networking blade of the ADX cluster and make sure the appropriate network access option is selected.
As the data volume is large, also try scaling up your cluster and check again.
After these checks, make sure the No truncation option is enabled in the copy activity source. By default, Kusto truncates query results at around 500,000 records, so this option is needed to copy more than that.
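In the pipeline JSON, the No truncation checkbox corresponds to the noTruncation property on the Azure Data Explorer source. A minimal sketch of the copy activity's type properties is shown below; the query/table name is a placeholder, and your sink settings (alternate key, write behavior) will depend on your Dataverse table:

```json
{
  "typeProperties": {
    "source": {
      "type": "AzureDataExplorerSource",
      "query": "MyLargeTable",
      "noTruncation": true
    },
    "sink": {
      "type": "CommonDataServiceForAppsSink",
      "writeBehavior": "upsert"
    }
  }
}
```

With noTruncation set to true, the service issues the query with Kusto's notruncation request option, so result sets larger than the default 500,000-record limit can be read.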
I followed the same steps and was able to copy up to 1 million records without any error.