I am doing a POC on moving data from an Oracle database to HDFS using Kafka (I'm new to this). I have configured my extract on the Oracle database; it publishes the changes made to my source table to my Kafka handler, and I can see the data on my topic. Now I want to do the same with the Kafka Connect handler, and I want the schema and data written to a file in HDFS. I know I can configure the Kafka Connect handler to read my extract trail files, but where do I put the details of my target HDFS so that Kafka can dump the data into HDFS?
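For context, my understanding is that the Kafka Connect handler only publishes to the topic; the HDFS details would go in a separate sink connector config. A sketch of what I think that would look like, assuming the Confluent HDFS Sink Connector is installed (the connector name, topic name, and NameNode URL below are placeholders):

```properties
# hdfs-sink.properties — hypothetical sketch, assumes Confluent's HDFS Sink Connector
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
# Topic that the GoldenGate Kafka Connect handler publishes to (placeholder name)
topics=ORA_SOURCE_TABLE
# Target HDFS — presumably this is where the NameNode URL goes
hdfs.url=hdfs://namenode:8020
# Number of records to buffer before writing a file to HDFS
flush.size=100
# Write Avro files so the schema travels with the data
format.class=io.confluent.connect.hdfs.avro.AvroFormat
```

Is this the right place for the HDFS target details, or does GoldenGate expect them somewhere else?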