Integrating Supabase as a Source with GlassFlow Using Webhook Connector
In this guide, you will learn how to integrate Supabase as a data source with GlassFlow using the Webhook connector. This integration lets you detect and stream data changes from your Supabase database directly into a GlassFlow pipeline in real time. Read more about the use case with Supabase and GlassFlow.
Before you begin, make sure you have the following:
A Supabase account with a project set up.
A GlassFlow account. Sign up for a free GlassFlow account.
A database table in Supabase that you want to monitor for changes.
Log in to the GlassFlow WebApp.
Navigate to GlassFlow WebApp and log in with your credentials.
Create a New Pipeline.
Go to the "Pipelines" section and click on "Create New Pipeline."
Provide a name for your pipeline, for example, Supabase-Webhook-Pipeline.
Select the "Space" you want the pipeline to reside in.
Choose "Webhook" as the Data Source.
During the pipeline creation process, select "Webhook" as your data source connector.
GlassFlow will provide you with a unique Webhook URL. This is where Supabase will send data whenever a specified event occurs.
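For reference, Supabase database webhooks deliver a JSON body describing the change. The sketch below shows roughly what such a payload can look like for an insert on a hypothetical users table (the type/table/schema/record/old_record fields follow Supabase's documented webhook format, but verify against the events your own tables produce):

```python
# Illustrative Supabase database webhook payload for an INSERT on a hypothetical "users" table.
# Supabase sends a JSON body of this shape to the GlassFlow Webhook URL.
sample_event = {
    "type": "INSERT",        # INSERT, UPDATE, or DELETE
    "table": "users",
    "schema": "public",
    "record": {
        "id": 42,
        "email": "new.user@example.com",
        "created_at": "2024-01-01T00:00:00Z",
    },
    "old_record": None,      # populated for UPDATE and DELETE events
}
```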
Configure the Transformation Stage.
In the pipeline setup, you will see an option to add a transformation function. This is where you can define how the data should be processed or transformed before it reaches the final destination.
You can upload a Python script (transform.py) or write your transformation logic directly in the GlassFlow WebApp.
For example, if you want to enrich the incoming data or filter out specific fields, you can implement this logic here.
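A minimal sketch of such a transformation is shown below. It assumes the handler(data, log) convention from the WebApp's transform template, where data is the webhook payload shown earlier and log behaves like a standard Python logger; adjust it to the template generated for your pipeline.

```python
# transform.py -- minimal sketch of a GlassFlow transformation.
# Assumes the handler(data, log) convention from the WebApp template;
# "data" is the JSON body Supabase posted to the Webhook URL.
def handler(data, log):
    record = data.get("record") or {}

    # Example: keep only the fields downstream consumers need.
    transformed = {
        "event_type": data.get("type"),
        "table": data.get("table"),
        "user_id": record.get("id"),
        "email": record.get("email"),
    }

    log.info("Processed %s event from table %s", data.get("type"), data.get("table"))
    return transformed
```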
Choose Dependencies (if needed).
If your transformation requires external libraries (e.g., pandas, openai), you can select them from the dependency menu in GlassFlow.
Select Your Data Sink.
After setting up the transformation stage, you need to choose where the transformed data will be sent.
You can choose an existing built-in integration like AWS S3, Google Pub/Sub, or use the SDK to connect to a custom endpoint or another service.
Configure the Data Sink.
If using a built-in integration, follow the prompts to enter the required details (e.g., bucket name for S3, topic for Pub/Sub).
If using the SDK, you will configure the sink in your code to send the data to a specific API or service.
Confirm Pipeline Settings.
Review your pipeline configuration in GlassFlow and click "Create Pipeline."
Your pipeline is now active and will start receiving data from Supabase.
Copy the Webhook URL.
Copy the Webhook URL and pipeline Access Token provided by GlassFlow. You will need these in the next steps to configure Supabase.
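Before wiring up Supabase, you can optionally verify the pipeline end to end by posting a sample event to the Webhook URL yourself. Below is a small sketch using the requests library; WEBHOOK_URL and ACCESS_TOKEN are placeholders for the values you just copied, and the payload mimics the Supabase webhook format shown earlier.

```python
# Sketch: send a test event to the GlassFlow Webhook URL before configuring Supabase.
# WEBHOOK_URL and ACCESS_TOKEN are placeholders for the values copied from GlassFlow.
import requests

WEBHOOK_URL = "https://<your-glassflow-webhook-url>"
ACCESS_TOKEN = "<your-pipeline-access-token>"

headers = {
    "X-Pipeline-Access-Token": ACCESS_TOKEN,
    "Content-Type": "application/json",
}

# Payload mimicking the Supabase webhook format shown earlier.
payload = {
    "type": "INSERT",
    "table": "users",
    "schema": "public",
    "record": {"id": 1, "email": "test@example.com"},
    "old_record": None,
}

response = requests.post(WEBHOOK_URL, json=payload, headers=headers)
print(response.status_code, response.text)
```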
Log in to Supabase.
Go to your Supabase account and navigate to your project.
Navigate to the "API" Section.
In your Supabase project, find the "API" section under "Project Settings" where you can manage webhooks and other API settings.
Create a New Webhook.
Set up a new Webhook to monitor the database table(s) you want.
Paste the Webhook URL you copied from GlassFlow into the Webhook URL field in Supabase.
Add the following headers: X-Pipeline-Access-Token set to a valid Access Token you copied from GlassFlow, and Content-Type set to application/json.
Select Events to Monitor.
Choose the events you want to monitor (Insert, Update, Delete).
Optionally, you can filter the events based on certain conditions.
Save the Webhook Configuration.
Save your webhook settings in Supabase. Now, whenever an event occurs in the specified table(s), Supabase will trigger the webhook, sending data to your GlassFlow pipeline.
Monitor the Data Stream.
Use the GlassFlow WebApp Pipeline Logs section to monitor incoming data and ensure your pipeline is functioning correctly.
View Transformed Data.
Depending on the chosen data sink, you can now view the transformed data in the configured destination (e.g., AWS S3, Pub/Sub, or a custom service consuming events with the Python SDK).
To get started with consuming events from your pipeline using the Python SDK, you can use the code snippet provided on the Pipeline Details page.
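As an illustration only, a consumer loop might look roughly like the sketch below. The client class and method names here are assumptions, not the confirmed SDK API, so copy the generated snippet from the Pipeline Details page for your pipeline and use that instead.

```python
# Hypothetical sketch of consuming pipeline events with the GlassFlow Python SDK.
# Class and method names below are assumptions; use the exact snippet from the
# Pipeline Details page, which is generated for your pipeline.
import time

import glassflow  # pip install glassflow

client = glassflow.GlassFlowClient()
pipeline = client.pipeline_client(
    pipeline_id="<your-pipeline-id>",
    pipeline_access_token="<your-pipeline-access-token>",
)

while True:
    response = pipeline.consume()   # poll for the next transformed event
    if response.status_code == 200:
        print("Received event:", response.json())
    time.sleep(1)                   # avoid a tight polling loop
```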
You’ve successfully set up a real-time data pipeline from Supabase to GlassFlow using the Webhook connector. This setup is ideal for applications that require immediate data processing, such as real-time analytics, notifications, and more.