I’m currently working on a project that uses Supabase with multiple environments (local, staging, and production), and my development workflow moves changes from local through staging to production.
It seems that Supabase webhooks retain the URL of the environment they were created in, causing issues when migrating between environments. For instance, if I create a webhook locally, it continues to point to the local edge function even after deploying to staging or production. Is there a way to dynamically set the webhook URLs based on the environment to ensure they always point to the correct edge function?
The best way to manage webhooks and edge functions across multiple environments in Supabase is to store environment-specific configuration in the Vault, an approach suggested in several community discussions.
Step 1: Set Up Vault Secrets
For production, navigate to your Supabase project’s settings, then go to the Vault section. Here, you can add the secret that holds the base URL for your edge functions. This URL will vary based on the environment.
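You can also create the same secret with SQL from the project’s SQL editor. Here is a minimal sketch, assuming your project’s API URL has the usual https://<project-ref>.supabase.co form and using the secret name that the rest of this answer relies on:

SELECT vault.create_secret(
  'https://<project-ref>.supabase.co',  -- base URL of this environment's API / edge functions
  'supabase_project_url',               -- name that the trigger function below looks up
  'URL to be used for calling edge functions from database triggers'
);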
For local development, you can create this secret during the seeding stage of your setup. Add the following SQL to your seed file:
SELECT vault.create_secret(
  'http://host.docker.internal:54321',
  'supabase_project_url',
  'URL to be used for calling edge functions; set here so we can develop edge functions with webhooks from database triggers across multiple environments'
);
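To double-check that the secret was stored, you can read it back through the same view the trigger function queries later:

-- Sanity check: the decrypted value should be the local API URL.
SELECT name, decrypted_secret
FROM vault.decrypted_secrets
WHERE name = 'supabase_project_url';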
Step 2: Replace Supabase Webhooks with Custom Triggers
Instead of using Supabase’s built-in webhooks, which can be cumbersome in a multi-environment setup, you can create custom triggers that call your edge functions directly.
First, create a trigger for your local development environment and include it in your migration files. You can generate a migration file automatically using supabase db diff -f <new-migration>:
CREATE OR REPLACE TRIGGER name_of_my_trigger
AFTER INSERT ON "public"."words"
FOR EACH ROW
EXECUTE FUNCTION public.call_edge_function(
  -- arguments: url_path, method, headers, params, timeout_ms
  '/functions/v1/name-of-my-edge-function',
  'POST',
  '{"Content-type":"application/json","x-supabase-webhook-source":"whatever"}',
  '{}',
  '5000'
);
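If the table already had a webhook created through the dashboard, that webhook exists as a trigger on the table, so it is worth dropping it in the same migration to avoid duplicate calls. The trigger name here is a placeholder for whatever your dashboard webhook was called:

-- Hypothetical cleanup: remove an existing dashboard-created webhook trigger, if any.
DROP TRIGGER IF EXISTS name_of_my_old_webhook ON "public"."words";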
Step 3: Implement the Custom Function to Handle Edge Function Calls
I’ve extracted the function that Supabase uses for webhooks to maintain compatibility and added an extra step to read the Vault secret and construct the full URL dynamically. Here’s the function:
CREATE OR REPLACE FUNCTION public.call_edge_function()
RETURNS trigger
LANGUAGE plpgsql
SECURITY DEFINER
SET search_path TO 'supabase_functions'
AS $function$
DECLARE
  request_id bigint;
  payload jsonb;
  url_path text := TG_ARGV[0]::text;
  method text := TG_ARGV[1]::text;
  headers jsonb DEFAULT '{}'::jsonb;
  params jsonb DEFAULT '{}'::jsonb;
  timeout_ms integer DEFAULT 1000;
  supabase_project_url text;
  full_url text;
BEGIN
  IF url_path IS NULL OR url_path = 'null' THEN
    RAISE EXCEPTION 'url_path argument is missing';
  END IF;

  -- Retrieve the base URL from the Vault
  SELECT decrypted_secret INTO supabase_project_url
  FROM vault.decrypted_secrets
  WHERE name = 'supabase_project_url';

  IF supabase_project_url IS NULL OR supabase_project_url = 'null' THEN
    RAISE EXCEPTION 'supabase_project_url secret is missing or invalid';
  END IF;

  -- Construct the full URL by concatenating the base URL with url_path
  full_url := supabase_project_url || url_path;

  IF method IS NULL OR method = 'null' THEN
    RAISE EXCEPTION 'method argument is missing';
  END IF;

  IF TG_ARGV[2] IS NULL OR TG_ARGV[2] = 'null' THEN
    headers = '{"Content-Type": "application/json"}'::jsonb;
  ELSE
    headers = TG_ARGV[2]::jsonb;
  END IF;

  IF TG_ARGV[3] IS NULL OR TG_ARGV[3] = 'null' THEN
    params = '{}'::jsonb;
  ELSE
    params = TG_ARGV[3]::jsonb;
  END IF;

  IF TG_ARGV[4] IS NULL OR TG_ARGV[4] = 'null' THEN
    timeout_ms = 1000;
  ELSE
    timeout_ms = TG_ARGV[4]::integer;
  END IF;

  CASE
    WHEN method = 'GET' THEN
      SELECT http_get INTO request_id FROM net.http_get(
        full_url,
        params,
        headers,
        timeout_ms
      );
    WHEN method = 'POST' THEN
      payload = jsonb_build_object(
        'old_record', OLD,
        'record', NEW,
        'type', TG_OP,
        'table', TG_TABLE_NAME,
        'schema', TG_TABLE_SCHEMA
      );
      SELECT http_post INTO request_id FROM net.http_post(
        full_url,
        payload,
        params,
        headers,
        timeout_ms
      );
    ELSE
      RAISE EXCEPTION 'method argument % is invalid', method;
  END CASE;

  INSERT INTO supabase_functions.hooks
    (hook_table_id, hook_name, request_id)
  VALUES
    (TG_RELID, TG_NAME, request_id);

  RETURN NEW;
END;
$function$;
Execute this function on your local database and include it in your migration files as well.
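A quick way to smoke-test this locally (with supabase functions serve running) is to fire the trigger and inspect the bookkeeping tables. The supabase_functions.hooks table is written by the function above, and net._http_response is where pg_net stores responses, assuming the standard pg_net schema:

-- Fire the trigger (adjust columns and values to your actual "words" table):
-- INSERT INTO public.words (...) VALUES (...);

-- The trigger function logs each call here:
SELECT hook_name, request_id
FROM supabase_functions.hooks
ORDER BY id DESC
LIMIT 5;

-- pg_net records the HTTP response for each request here:
SELECT id, status_code, error_msg
FROM net._http_response
ORDER BY id DESC
LIMIT 5;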
Step 4: Integrate into Your CI/CD Pipeline
Finally, automate the deployment of your database migrations and edge functions to the production environment using a CI/CD pipeline. For GitHub Actions, your workflow might look like this:
jobs:
  deploy-to-supabase:
    runs-on: ubuntu-latest
    env:
      SUPABASE_ACCESS_TOKEN: ${{ secrets.SUPABASE_ACCESS_TOKEN }}
      SUPABASE_DB_PASSWORD: ${{ secrets.PRODUCTION_DB_PASSWORD }}
      SUPABASE_PROJECT_ID: ${{ secrets.PRODUCTION_PROJECT_ID }}
    steps:
      - uses: actions/checkout@v4
      - uses: supabase/setup-cli@v1
        with:
          version: latest
      - run: supabase link --project-ref $SUPABASE_PROJECT_ID
      - run: supabase db push
      - run: supabase functions deploy
With these steps, you’ll be all set to conquer the multi-environment madness in Supabase.