apache-kafka, apache-kafka-connect, snowflake-cloud-data-platform

Snowflake Kafka connector config issue


I'm following the steps in this guide: Snowflake Connector for Kafka

The error message I'm getting is

BadRequestException: Connector config {.....} contains no connector type

I am running the command as

sh kafka_2.12-2.3.0/bin/connect-standalone.sh connect-standalone.properties snowflake_kafka_config.json

My config files are:

connect-standalone.properties

bootstrap.servers=localhost:9092
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter=org.apache.kafka.connect.json.JsonConverter

key.converter.schemas.enable=true
value.converter.schemas.enable=true

offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000

plugin.path=/Users/kafka_test/kafka

The jar file snowflake-kafka-connector-0.5.1.jar is in plugin.path.

snowflake_kafka_config.json

{
  "name":"Kafka_Test",
  "Config":{
    "connector.class":"com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "tasks.max":"8",
    "topics":"test",
    "snowflake.topic2table.map": "",
    "buffer.count.records":"1",
    "buffer.flush.time":"60",
    "buffer.size.bytes":"65536",
    "snowflake.url.name":"<url>",
    "snowflake.user.name":"<user_name>",
    "snowflake.private.key":"<private_key>",
    "snowflake.private.key.passphrase":"<pass_phrase>",
    "snowflake.database.name":"<db>",
    "snowflake.schema.name":"<schema>",
    "key.converter":"org.apache.kafka.connect.storage.StringConverter",
    "value.converter":"com.snowflake.kafka.connector.records.SnowflakeJsonConverter",
    "value.converter.schema.registry.url":"",
    "value.converter.basic.auth.credentials.source":"",
    "value.converter.basic.auth.user.info":""
  }
}

Kafka is running locally; I have a producer and a consumer up and can see data flowing.


Solution

  • This is the same question I answered over on the Confluent community Slack, but I'll post it here for reference too :-)


    The connect worker log shows that the connector JAR itself is being loaded, so the `contains no connector type` error is because your config formatting is fubar.

    You're running in Standalone mode but passing in a JSON file, which won't work; standalone mode expects a Java properties file. My personal opinion is to always use distributed mode, even if just a single node of it.
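
    A rough sketch of the distributed approach, assuming the worker uses the connect-distributed.properties file that ships with Kafka and listens on the default REST port 8083. Note that the Connect REST API only recognises a lowercase "config" key, so you'd also need to rename the "Config" key in your JSON:

    # Start a single-node distributed worker
    sh kafka_2.12-2.3.0/bin/connect-distributed.sh connect-distributed.properties

    # Submit the connector config via the Connect REST API
    curl -X POST -H "Content-Type: application/json" \
         --data @snowflake_kafka_config.json \
         http://localhost:8083/connectors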

    If you must use standalone then you need your connector config (snowflake_kafka_config.json) to be a properties file like this:

    param1=argument1
    param2=argument2
    
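    As an illustration (not tested against your environment), your JSON config above flattens to a properties file roughly like this: the "name" field becomes a top-level name= entry, everything under "Config" becomes a key=value line, and the empty-valued entries can simply be omitted:

    name=Kafka_Test
    connector.class=com.snowflake.kafka.connector.SnowflakeSinkConnector
    tasks.max=8
    topics=test
    buffer.count.records=1
    buffer.flush.time=60
    buffer.size.bytes=65536
    snowflake.url.name=<url>
    snowflake.user.name=<user_name>
    snowflake.private.key=<private_key>
    snowflake.private.key.passphrase=<pass_phrase>
    snowflake.database.name=<db>
    snowflake.schema.name=<schema>
    key.converter=org.apache.kafka.connect.storage.StringConverter
    value.converter=com.snowflake.kafka.connector.records.SnowflakeJsonConverter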

    You can see valid JSON examples (if you use distributed mode) here: https://github.com/confluentinc/demo-scene/blob/master/kafka-connect-zero-to-hero/demo_zero-to-hero-with-kafka-connect.adoc#stream-data-from-kafka-to-elasticsearch