amazon-web-services, apache-kafka, apache-kafka-connect

Kafka Connect docker image not recognizing AWS environment variable credentials?


I'm currently teaching myself Kafka Connect by setting up some basic pipelines in Docker. When I use the Connect REST API to create a new connector with a POST request, no error appears right away. However, when I send a request to the connector's status endpoint, I get an error stating "org.apache.kafka.connect.errors.ConnectException: com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain", followed by a list of all the places AWS looks for credentials. My config and curl command for the connector look like the following:

curl -s -X POST -H "Content-Type: application/json" --data '{
    "name": "S3Connector",
    "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "flush.size": "1",
        "name": "S3Connector", 
        "topics": "inputTopic",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.storage.StringConverter",
        "s3.bucket.name": "bucket-name-here",
        "s3.region": "us-east-2",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "s3.credentials.provider.secret_access_key": "secret-access-key-here",
        "s3.credentials.provider.access_key_id": "access-key-id-here",
        "format.class": "io.confluent.connect.s3.format.bytearray.ByteArrayFormat",
        
    }
}' http://localhost:8083/connectors
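
For reference, the status request that surfaces the error is a plain GET against the Connect REST API, using the connector name from the config above:

curl -s http://localhost:8083/connectors/S3Connector/status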

I initially attempted to pass the AWS credentials using just the s3.credentials config lines, but got this error. At that point, I tried manually setting the environment variables listed in the error with export, using the values of the two keys in the config above. When I deleted the connector and recreated it, I got the same error. Why would it have trouble reading the keys when they're in place?

Update: Setting the environment variables in docker-compose seems to have fixed this. I'm assuming that the Confluent Hub install for the S3 connector (in this case confluent-hub install --no-prompt confluentinc/kafka-connect-s3:latest) somehow picks up the environment variables that exist at install time, but can't pick them up after installation without some update to a separate config inside the install?


Solution

  • use the s3.credentials config lines

    You don't need the s3.credentials.provider. prefix for those; the documented options are aws.access.key.id and aws.secret.access.key - https://docs.confluent.io/kafka-connectors/s3-sink/current/configuration_options.html
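
    A minimal sketch of the corrected request, assuming those two options and otherwise keeping the config from the question (credential values are placeholders):

    curl -s -X POST -H "Content-Type: application/json" --data '{
        "name": "S3Connector",
        "config": {
            "connector.class": "io.confluent.connect.s3.S3SinkConnector",
            "flush.size": "1",
            "topics": "inputTopic",
            "key.converter": "org.apache.kafka.connect.storage.StringConverter",
            "value.converter": "org.apache.kafka.connect.storage.StringConverter",
            "s3.bucket.name": "bucket-name-here",
            "s3.region": "us-east-2",
            "storage.class": "io.confluent.connect.s3.storage.S3Storage",
            "aws.access.key.id": "access-key-id-here",
            "aws.secret.access.key": "secret-access-key-here",
            "format.class": "io.confluent.connect.s3.format.bytearray.ByteArrayFormat"
        }
    }' http://localhost:8083/connectors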

  • using export

    Docker containers don't inherit your exported shell environment by default; variables have to be passed to the container explicitly.
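
    For example, a minimal sketch of the relevant docker-compose service (the image tag, service name, and CONNECT_* settings are assumptions about your setup):

    connect:
      image: confluentinc/cp-kafka-connect:latest
      environment:
        # Standard AWS SDK variable names; these reach the container,
        # unlike variables exported only in the host shell
        AWS_ACCESS_KEY_ID: access-key-id-here
        AWS_SECRET_ACCESS_KEY: secret-access-key-here
        # ...plus your existing CONNECT_* settings
      ports:
        - "8083:8083"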

  • setting the environment variable in docker-compose seems to have fixed this

    Yes, this is the correct solution.

  • somehow uses what are supposed to be the existing environment variables at the time of install

    This is not the case. The AWS SDK is only invoked when you actually create the connector, not when you install it.