I am working on a small Worker that logs to an external service (Kinesis Firehose) and then issues a redirect.
I am trying to batch the external calls together to avoid hitting ingestion limits. Is this the correct way to do it, or is there a better way? (P.S. I looked at using queues, but the cost would be high for what we need.)
It seems to work locally but not when deployed.
Thank you,
Josh
import { createUrl, getFinalUrl, getUserId, isValidUrl } from "./lib/url";
import { sendToFirehose } from "./lib/sendToFirehose";

export interface Env {
  AWS_ACCESS_KEY_ID: string;
  AWS_SECRET_ACCESS_KEY: string;
}

let batch: Record<any, any>[] = [];

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { pathname, searchParams } = new URL(request.url);

    const data = {
      date: new Date(),
      url: request.url,
    };
    batch.push(data);

    if (batch.length === 100) {
      console.log("Batch size reached, sending to firehose");
      await sendToFirehose(batch);
      batch = [];
    }

    return Response.redirect("http://newurl.com", 302);
  },
};
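Two things likely explain the local-vs-deployed gap: in production the Worker runs in many isolates, each with its own copy of the module-scope `batch`, and any isolate can be evicted at any time, dropping whatever is still buffered; and `await sendToFirehose(batch)` makes every 100th visitor wait on the upload before being redirected. Below is a minimal sketch of a variant that at least keeps the redirect fast, assuming a small `Batcher` helper (illustrative, not a real API) and using `ctx.waitUntil` so the flush runs in the background. Note it is still lossy on eviction, which is why a queue is the more robust answer.

```typescript
// Sketch only: Batcher, the stub sendToFirehose, and the 100-event
// threshold are illustrative assumptions, not a fixed API.
interface Env {
  AWS_ACCESS_KEY_ID: string;
  AWS_SECRET_ACCESS_KEY: string;
}

// Minimal shape of the Workers ExecutionContext used below.
interface ExecutionContext {
  waitUntil(promise: Promise<unknown>): void;
}

// Stand-in for ./lib/sendToFirehose so the sketch is self-contained.
async function sendToFirehose(records: unknown[]): Promise<void> {}

// Batching logic pulled into a small class so it is easy to test.
class Batcher<T> {
  private items: T[] = [];
  constructor(private readonly size: number) {}

  // Returns the full batch once the threshold is reached, otherwise null.
  push(item: T): T[] | null {
    this.items.push(item);
    if (this.items.length < this.size) return null;
    const full = this.items;
    this.items = [];
    return full;
  }
}

// Per-isolate state: each isolate has its own batcher, and buffered
// events are lost if the isolate is evicted before a flush.
const batcher = new Batcher<{ date: string; url: string }>(100);

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const full = batcher.push({ date: new Date().toISOString(), url: request.url });
    if (full) {
      // waitUntil lets the redirect return immediately while the upload
      // finishes in the background instead of blocking on Firehose latency.
      ctx.waitUntil(sendToFirehose(full));
    }
    return Response.redirect("http://newurl.com", 302);
  },
};
```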
It seems the best way to do this is to use the Cloudflare Queues feature.
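For completeness, a hedged sketch of what the Queues version could look like. The `LOG_QUEUE` binding name and `LogEvent` shape are assumptions (the binding is configured in wrangler.toml), and the local interfaces stand in for the real Workers runtime types. The platform then owns the batching: the `queue()` consumer receives up to `max_batch_size` messages per invocation, so buffered events survive isolate evictions instead of living in module-scope state.

```typescript
// Sketch only: binding name, queue config, and LogEvent are assumptions.
interface LogEvent {
  date: string;
  url: string;
}

// Minimal shapes for the Queues runtime types used below.
interface Queue<T> {
  send(message: T): Promise<void>;
}
interface MessageBatch<T> {
  readonly messages: readonly { readonly body: T }[];
}
interface ExecutionContext {
  waitUntil(promise: Promise<unknown>): void;
}

interface Env {
  LOG_QUEUE: Queue<LogEvent>;
}

// Stand-in for ./lib/sendToFirehose so the sketch is self-contained.
async function sendToFirehose(records: LogEvent[]): Promise<void> {}

// Unwrap the payloads from a delivered batch.
function bodies<T>(messages: readonly { readonly body: T }[]): T[] {
  return messages.map((m) => m.body);
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // Enqueue without making the redirect wait on delivery.
    ctx.waitUntil(env.LOG_QUEUE.send({ date: new Date().toISOString(), url: request.url }));
    return Response.redirect("http://newurl.com", 302);
  },
  async queue(batch: MessageBatch<LogEvent>, env: Env): Promise<void> {
    // Queues hands the consumer up to max_batch_size messages at once
    // (set in wrangler.toml), so one Firehose call covers the whole batch.
    await sendToFirehose(bodies(batch.messages));
  },
};
```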