cors, fetch-api, aws-amplify, svelte, confluent-cloud

API request blocked by CORS policy with Confluent Cloud and Kafka


I'm trying to post a message to a Kafka cluster on Confluent Cloud. It works fine in Postman, but when I try it from my Svelte app, I get this error:

Access to fetch at <ENDPOINT> from origin 'http://localhost:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

The request looks like this:

const apiHeaders = new Headers();
apiHeaders.append("Content-Type", "application/json");
apiHeaders.append("Authorization", "Basic <BASE64>");

const clusterId = '<CLUSTERID>';
const restEndpoint = '<ENDPOINT>';

const postMessage = (data, topic) => {
    let raw = JSON.stringify({
        "value": {
            "type": "JSON",
            "data": {
                data
            }
        }
    });

    let requestOptions = {
        method: 'POST',
        headers: apiHeaders,
        body: raw,
        redirect: 'follow'
    };

    fetch(`${restEndpoint}/kafka/v3/clusters/${clusterId}/topics/${topic}/records`, requestOptions)
        .then(response => response.text())
        .then(result => console.log(result))
        .catch(error => console.log('error', error));
}



export const sendLogMessage = (data) => {
    postMessage(data, '<TOPICNAME>');
}

My headers are set like this in the AWS Amplify console:

[screenshot of the custom headers configuration in the Amplify console]

My headers are also set like this in svelte.config.js:

import adapter from '@sveltejs/adapter-static';
// import adapter from '@sveltejs/adapter-node';
// import adapter from '@sveltejs/adapter-auto';
// import firebase from "svelte-adapter-firebase";

/** @type {import('vite').Plugin} */
const viteServerConfig = {
    name: 'log-request-middleware',
    configureServer(server) {
        server.middlewares.use((req, res, next) => {
            res.setHeader("Access-Control-Allow-Origin", "*");
            res.setHeader("Access-Control-Allow-Methods", "GET, HEAD, POST");
            res.setHeader("Cross-Origin-Resource-Policy", "cross-origin");
            res.setHeader("Cross-Origin-Opener-Policy", "same-origin");
            res.setHeader("Cross-Origin-Embedder-Policy", "require-corp");
            res.setHeader("Access-Control-Allow-Headers", "Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with");
            next();
        });
    }
};

/** @type {import('@sveltejs/kit').Config} */
const config = {
    kit: {
        adapter: adapter(), // for the firebase adapter use "firebase()" instead of "adapter()"
        vite: {
            plugins: [viteServerConfig]
        },
        prerender: {
            default: true
        },
        trailingSlash: 'never'
    }
};

export default config;


Solution

  • Kafka was meant to be used in 3+ tier architecture solutions. From the beginning, one of its core selling points has been scalability. This is an opinion, but everything I can find online about Kafka, including tutorials, articles, and documentation, seems to support this intention.

    That being said, if you inspect the OPTIONS call your browser makes to Confluent Cloud, or explicitly make an OPTIONS call via Postman, you will see that there are no CORS response headers. By default, the REST API that Confluent Cloud exposes for your Kafka cluster is simply not intended to be accessed from a browser. The only way to make it work would be to put your Confluent Cloud bootstrap server on the same domain as your web UI, and given the abstraction Confluent Cloud provides as a managed service, that would be far more trouble than it's worth, if it's even possible.
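
    For example, a quick way to confirm this yourself is to send an OPTIONS request at the records endpoint and look at the CORS-related response headers. This is just a minimal sketch, assuming Node 18+ (built-in fetch) run as an ES module; the endpoint, cluster ID, and topic are placeholders exactly as in the question:

    // check-cors.js: send a preflight-style OPTIONS request and inspect the response
    const url = '<ENDPOINT>/kafka/v3/clusters/<CLUSTERID>/topics/<TOPICNAME>/records';

    const res = await fetch(url, {
        method: 'OPTIONS',
        headers: {
            // mimic what the browser sends in its preflight
            'Origin': 'http://localhost:3000',
            'Access-Control-Request-Method': 'POST',
            'Access-Control-Request-Headers': 'content-type,authorization'
        }
    });

    console.log(res.status);
    // expected to be null, since no Access-Control-Allow-Origin header comes back
    console.log(res.headers.get('access-control-allow-origin'));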

    Consider this: let's say you were able to successfully produce records into your Kafka topic directly from your web UI. That would mean your API key and secret were bundled into your front-end code, where it would be trivial for a bad actor to extract them and use them to mount a DoS attack. See this StackOverflow question.

    Your best bet is to call some secure API from your front-end and let the server hold the credentials; a minimal sketch of such an endpoint follows the setup steps below. From there it's trivial to produce records into Confluent Cloud, as they have many SDKs available. If you don't already have an API set up, or want to skip creating a dedicated endpoint for this, you can use the Confluent REST Proxy. It's simple to set up:

    1. On the server you will run the proxy from, install Java
    2. Install Docker
    3. Follow the tutorial instructions found here
    4. If the proxy server will run on a different domain than your front-end (as is typically the case during development), add the following environment variables to the rest-proxy.yml:

       KAFKA_REST_ACCESS_CONTROL_ALLOW_ORIGIN: "*"
       KAFKA_REST_ACCESS_CONTROL_ALLOW_METHODS: "GET,POST,PUT,DELETE"
       KAFKA_REST_ACCESS_CONTROL_ALLOW_HEADERS: "origin,content-type,accept,authorization"
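
    If you do go the route of your own secure API, here is a minimal sketch of what such an endpoint could look like in SvelteKit, assuming a recent SvelteKit with a server-capable adapter (e.g. adapter-node rather than adapter-static). It's only an illustration: the /api/log route and the CONFLUENT_* environment variable names are made up for the example, and the record shape mirrors the one in the question.

    // src/routes/api/log/+server.js (hypothetical route)
    // The API key/secret stay on the server and never ship in browser code.
    import { json } from '@sveltejs/kit';
    import { env } from '$env/dynamic/private';

    // Assumed env vars (names are illustrative):
    // CONFLUENT_REST_ENDPOINT, CONFLUENT_CLUSTER_ID, CONFLUENT_API_KEY, CONFLUENT_API_SECRET
    const auth = Buffer.from(
        `${env.CONFLUENT_API_KEY}:${env.CONFLUENT_API_SECRET}`
    ).toString('base64');

    export async function POST({ request }) {
        const { topic, data } = await request.json();

        // Same produce call as in the question, but made server-to-server,
        // so the browser never talks to Confluent Cloud directly.
        const res = await fetch(
            `${env.CONFLUENT_REST_ENDPOINT}/kafka/v3/clusters/${env.CONFLUENT_CLUSTER_ID}/topics/${topic}/records`,
            {
                method: 'POST',
                headers: {
                    'Content-Type': 'application/json',
                    'Authorization': `Basic ${auth}`
                },
                body: JSON.stringify({ value: { type: 'JSON', data } })
            }
        );

        return json(await res.json(), { status: res.status });
    }

    Your Svelte component then calls fetch('/api/log', { method: 'POST', body: JSON.stringify({ topic, data }) }) against its own origin, so there is no cross-origin request and no credentials in the front-end bundle.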