# Postgres Changes

Listen to Postgres changes using Datafuse Realtime.

Let's explore how to use Realtime's Postgres Changes feature to listen to database events.

## Quick start

In this example we'll set up a database table, secure it with Row Level Security, and subscribe to all changes using the Datafuse client libraries.

<StepHikeCompact.Step step={1}> <StepHikeCompact.Details title="Set up a Datafuse project with a 'todos' table">

[Create a new project](https://app.datafuse.com) in the Datafuse Dashboard.

After your project is ready, create a table in your Datafuse database. You can do this with either the Table interface or the [SQL Editor](https://app.datafuse.com/project/_/sql).

</StepHikeCompact.Details>

<StepHikeCompact.Code>

  <Tabs
    scrollable
    size="small"
    type="underlined"
    defaultActiveId="sql"
    queryGroup="database-method"
  >
  <TabPanel id="sql" label="SQL">

  ```sql
  -- Create a table called "todos"
  -- with a column to store tasks.
  create table todos (
    id serial primary key,
    task text
  );
  ```

  </TabPanel>
  <TabPanel id="dashboard" label="Dashboard">

  <video width="99%" muted playsInline controls={true}>
    <source
      src="https://xguihxuzqibwxjnimxev.datafuse.co/storage/v1/object/public/videos/docs/api/api-create-table-sm.mp4"
      type="video/mp4"
    />
  </video>

  </TabPanel>
  </Tabs>

</StepHikeCompact.Code>

</StepHikeCompact.Step>

<StepHikeCompact.Step step={2}>

<StepHikeCompact.Details title="Allow anonymous access">

In this example we'll turn on [Row Level Security](/docs/guides/database/postgres/row-level-security) for this table and allow anonymous access. In production, be sure to secure your application with the appropriate permissions.

</StepHikeCompact.Details>

<StepHikeCompact.Code>

  ```sql
  -- Turn on security
  alter table "todos"
  enable row level security;

  -- Allow anonymous access
  create policy "Allow anonymous access"
  on todos
  for select
  to anon
  using (true);
  ```

</StepHikeCompact.Code>
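
In production you would typically scope this policy more tightly, for example to authenticated users or to row owners. A minimal sketch, assuming a hypothetical `user_id` column on `todos` and an `auth.uid()` helper in your database:

```sql
-- Hypothetical stricter policy: users can only read their own todos
-- (assumes a "user_id" column and an auth.uid() function exist)
create policy "Users can read their own todos"
on todos
for select
to authenticated
using (auth.uid() = user_id);
```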

</StepHikeCompact.Step>

<StepHikeCompact.Step step={3}>

<StepHikeCompact.Details title="Enable Postgres replication">

  Go to your project's [Publications settings](https://datafuse.com/dashboard/project/_/database/publications), and under `datafuse_realtime`, toggle on the tables you want to listen to.

</StepHikeCompact.Details>
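
If you prefer SQL, you can also add a table to the publication directly. A minimal sketch, assuming the `datafuse_realtime` publication mentioned above:

<StepHikeCompact.Code>

  ```sql
  -- Add the "todos" table to the Realtime publication
  alter publication datafuse_realtime
  add table todos;
  ```

</StepHikeCompact.Code>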

</StepHikeCompact.Step>

<StepHikeCompact.Step step={4}>

<StepHikeCompact.Details title="Install the client">

  Install the Datafuse JavaScript client.

</StepHikeCompact.Details>

<StepHikeCompact.Code>

  ```bash
    npm install @datafuse/datafuse-js
  ```

</StepHikeCompact.Code>

</StepHikeCompact.Step>

<StepHikeCompact.Step step={5}>

<StepHikeCompact.Details title="Create the client">

  This client will be used to listen to Postgres changes.

</StepHikeCompact.Details>

<StepHikeCompact.Code>

  ```js
    import { createClient } from '@datafuse/datafuse-js'

    const client = createClient(
      'https://<project>.datafuse.co',
      '<your-anon-key>'
    )
  ```

</StepHikeCompact.Code>

</StepHikeCompact.Step>

<StepHikeCompact.Step step={6}> <StepHikeCompact.Details title="Listen to changes by schema">

Listen to changes on all tables in the `public` schema by setting the `schema` property to 'public' and event name to `*`. The event name can be one of:
  - `INSERT`
  - `UPDATE`
  - `DELETE`
  - `*`

The channel name can be any string except 'realtime'.

</StepHikeCompact.Details>

<StepHikeCompact.Code>

  ```js
    const channelA = client
      .channel('schema-db-changes')
      .on(
        'postgres_changes',
        {
          event: '*',
          schema: 'public',
        },
        (payload) => console.log(payload)
      )
      .subscribe()
  ```

</StepHikeCompact.Code>

</StepHikeCompact.Step>

<StepHikeCompact.Step step={7}>
<StepHikeCompact.Details title="Insert dummy data">

Now we can add some data to our table which will trigger the `channelA` event handler.

</StepHikeCompact.Details>

<StepHikeCompact.Code>

  ```sql
  insert into todos (task)
  values
    ('Change!');
  ```

</StepHikeCompact.Code>

</StepHikeCompact.Step>

## Usage

You can use the Datafuse client libraries to subscribe to database changes.

### Listening to specific schemas

Subscribe to specific schema events using the `schema` parameter:

JavaScript:

```js
const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      schema: 'public', // Subscribes to the "public" schema in Postgres
      event: '*',       // Listen to all changes
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('schema-db-changes')
    .onPostgresChanges(
        schema: 'public', // Subscribes to the "public" schema in Postgres
        event: PostgresChangeEvent.all, // Listen to all changes

        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("schema-db-changes")

let changes = await myChannel.postgresChange(AnyAction.self, schema: "public")

await myChannel.subscribe()

for await change in changes {
  switch change {
  case .insert(let action): print(action)
  case .update(let action): print(action)
  case .delete(let action): print(action)
  case .select(let action): print(action)
  }
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("schema-db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction>(schema = "public")

changes
    .onEach {
        when(it) { //You can also check for <is PostgresAction.Insert>, etc.. manually
            is HasRecord -> println(it.record)
            is HasOldRecord -> println(it.oldRecord)
            else -> println(it)
        }
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

The channel name can be any string except 'realtime'.

### Listening to INSERT events

Use the `event` parameter to listen only to database INSERTs:

JavaScript:

```js
const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT', // Listen only to INSERTs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
final changes = datafuse
    .channel('schema-db-changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.insert,
        schema: 'public',
        callback: (payload) => print(payload))
    .subscribe();
```

In Swift, use `InsertAction.self` as the type to listen only to database INSERTs:

```swift
let myChannel = await datafuse.channel("schema-db-changes")

let changes = await myChannel.postgresChange(InsertAction.self, schema: "public")

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

In Kotlin, use `PostgresAction.Insert` as the type to listen only to database INSERTs:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Insert>(schema = "public")

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

The channel name can be any string except 'realtime'.

### Listening to UPDATE events

Use the `event` parameter to listen only to database UPDATEs:

JavaScript:

```js
const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE', // Listen only to UPDATEs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('schema-db-changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.update, // Listen only to UPDATEs
        schema: 'public',
        callback: (payload) => print(payload))
    .subscribe();
```

In Swift, use `UpdateAction.self` as the type to listen only to database UPDATEs:

```swift
let myChannel = await datafuse.channel("schema-db-changes")

let changes = await myChannel.postgresChange(UpdateAction.self, schema: "public")

await myChannel.subscribe()

for await change in changes {
  print(change.oldRecord, change.record)
}
```

In Kotlin, use `PostgresAction.Update` as the type to listen only to database UPDATEs:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Update>(schema = "public")

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

The channel name can be any string except 'realtime'.

### Listening to DELETE events

Use the `event` parameter to listen only to database DELETEs:

JavaScript:

```js
const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'DELETE', // Listen only to DELETEs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('schema-db-changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.delete, // Listen only to DELETEs
        schema: 'public',
        callback: (payload) => print(payload))
    .subscribe();
```

In Swift, use `DeleteAction.self` as the type to listen only to database DELETEs:

```swift
let myChannel = await datafuse.channel("schema-db-changes")

let changes = await myChannel.postgresChange(DeleteAction.self, schema: "public")

await myChannel.subscribe()

for await change in changes {
  print(change.oldRecord)
}
```

In Kotlin, use `PostgresAction.Delete` as the type to listen only to database DELETEs:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Delete>(schema = "public")

changes
    .onEach {
        println(it.oldRecord)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

The channel name can be any string except 'realtime'.

### Listening to specific tables

Subscribe to specific table events using the `table` parameter:

JavaScript:

```js
const changes = client
  .channel('table-db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'todos',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('table-db-changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.all,
        schema: 'public',
        table: 'todos',
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(AnyAction.self, schema: "public", table: "todos")

await myChannel.subscribe()

for await change in changes {
  switch change {
  case .insert(let action): print(action)
  case .update(let action): print(action)
  case .delete(let action): print(action)
  case .select(let action): print(action)
  }
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction>(schema = "public") {
    table = "todos"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

The channel name can be any string except 'realtime'.

### Listening to multiple changes

To listen to different events and schema/table/filter combinations with the same channel:

JavaScript:

```js
const channel = datafuse
  .channel('db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
    },
    (payload) => console.log(payload)
  )
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'users',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('db-changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.all,
        schema: 'public',
        table: 'messages',
        callback: (payload) => print(payload))
    .onPostgresChanges(
        event: PostgresChangeEvent.insert,
        schema: 'public',
        table: 'users',
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let messageChanges = await myChannel.postgresChange(AnyAction.self, schema: "public", table: "messages")
let userChanges = await myChannel.postgresChange(InsertAction.self, schema: "public", table: "users")

await myChannel.subscribe()
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")
val messageChanges = myChannel.postgresChangeFlow<PostgresAction>(schema = "public") {
    table = "messages"
}
val userChanges = myChannel.postgresChangeFlow<PostgresAction.Insert>(schema = "public") {
    table = "users"
}
myChannel.subscribe()
```

### Filtering for specific changes

Use the `filter` parameter for granular changes:

JavaScript:

```js
const changes = client
  .channel('table-filter-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'todos',
      filter: 'id=eq.1',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
  datafuse
      .channel('table-filter-changes')
      .onPostgresChanges(
          event: PostgresChangeEvent.insert,
          schema: 'public',
          table: 'todos',
          filter: PostgresChangeFilter(
            type: PostgresChangeFilterType.eq,
            column: 'id',
            value: 1,
          ),
          callback: (payload) => print(payload))
      .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  InsertAction.self,
  schema: "public",
  table: "todos",
  filter: "id=eq.1"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Insert>(schema = "public") {
    table = "todos"
    filter = "id=eq.1"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

## Available filters

Realtime offers filters so you can specify the data your client receives at a more granular level.

### Equal to (eq)

To listen to changes when a column's value in a table equals a client-specified value:

JavaScript:

```js
const channel = datafuse
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.hey',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.update,
        schema: 'public',
        table: 'messages',
        filter: PostgresChangeFilter(
          type: PostgresChangeFilterType.eq,
          column: 'body',
          value: 'hey',
        ),
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  UpdateAction.self,
  schema: "public",
  table: "messages",
  filter: "body=eq.hey"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Update>(schema = "public") {
    table = "messages"
    filter = "body=eq.hey"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

This filter uses Postgres's = filter.

### Not equal to (neq)

To listen to changes when a column's value in a table does not equal a client-specified value:

JavaScript:

```js
const channel = datafuse
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'messages',
      filter: 'body=neq.bye',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.insert,
        schema: 'public',
        table: 'messages',
        filter: PostgresChangeFilter(
          type: PostgresChangeFilterType.neq,
          column: 'body',
          value: 'bye',
        ),
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  UpdateAction.self,
  schema: "public",
  table: "messages",
  filter: "body=neq.hey"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Update>(schema = "public") {
    table = "messages"
    filter = "body=neq.bye"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

This filter uses Postgres's != filter.

### Less than (lt)

To listen to changes when a column's value in a table is less than a client-specified value:

JavaScript:

```js
const channel = datafuse
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'profiles',
      filter: 'age=lt.65',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.insert,
        schema: 'public',
        table: 'profiles',
        filter: PostgresChangeFilter(
          type: PostgresChangeFilterType.lt,
          column: 'age',
          value: 65,
        ),
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  InsertAction.self,
  schema: "public",
  table: "profiles",
  filter: "age=lt.65"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Insert>(schema = "public") {
    table = "profiles"
    filter = "age=lt.65"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

This filter uses Postgres's < filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

### Less than or equal to (lte)

To listen to changes when a column's value in a table is less than or equal to a client-specified value:

JavaScript:

```js
const channel = datafuse
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'profiles',
      filter: 'age=lte.65',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.insert,
        schema: 'public',
        table: 'profiles',
        filter: PostgresChangeFilter(
          type: PostgresChangeFilterType.lte,
          column: 'age',
          value: 65,
        ),
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  InsertAction.self,
  schema: "public",
  table: "profiles",
  filter: "age=lte.65"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Update>(schema = "public") {
    table = "profiles"
    filter = "age=lte.65"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

This filter uses Postgres' <= filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

### Greater than (gt)

To listen to changes when a column's value in a table is greater than a client-specified value:

JavaScript:

```js
const channel = datafuse
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'products',
      filter: 'quantity=gt.10',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.insert,
        schema: 'public',
        table: 'products',
        filter: PostgresChangeFilter(
          type: PostgresChangeFilterType.gt,
          column: 'quantity',
          value: 10,
        ),
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  InsertAction.self,
  schema: "public",
  table: "products",
  filter: "quantity=gt.10"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Update>(schema = "public") {
    table = "products"
    filter = "quantity=gt.10"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

This filter uses Postgres's > filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

### Greater than or equal to (gte)

To listen to changes when a column's value in a table is greater than or equal to a client-specified value:

JavaScript:

```js
const channel = datafuse
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'products',
      filter: 'quantity=gte.10',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.insert,
        schema: 'public',
        table: 'products',
        filter: PostgresChangeFilter(
          type: PostgresChangeFilterType.gte,
          column: 'quantity',
          value: 10,
        ),
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  InsertAction.self,
  schema: "public",
  table: "products",
  filter: "quantity=gte.10"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Update>(schema = "public") {
    table = "products"
    filter = "quantity=gte.10"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

This filter uses Postgres's >= filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

### Contained in list (in)

To listen to changes when a column's value in a table equals any of the client-specified values:

JavaScript:

```js
const channel = datafuse
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'colors',
      filter: 'name=in.(red, blue, yellow)',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse
    .channel('changes')
    .onPostgresChanges(
        event: PostgresChangeEvent.insert,
        schema: 'public',
        table: 'colors',
        filter: PostgresChangeFilter(
          type: PostgresChangeFilterType.inFilter,
          column: 'name',
          value: ['red', 'blue', 'yellow'],
        ),
        callback: (payload) => print(payload))
    .subscribe();
```

Swift:

```swift
let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  InsertAction.self,
  schema: "public",
  table: "products",
  filter: "name=in.(red, blue, yellow)"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Update>(schema = "public") {
    table = "products"
    filter = "name=in.(red, blue, yellow)"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

This filter uses Postgres's = ANY. Realtime allows a maximum of 100 values for this filter.

## Receiving old records

By default, only the new record is sent. If you want to receive the old record (its previous values) whenever you UPDATE or DELETE a record, set the replica identity of your table to `full`:

```sql
alter table messages
replica identity full;
```

RLS policies are not applied to DELETE statements, because there is no way for Postgres to verify that a user has access to a deleted record. When RLS is enabled and replica identity is set to full on a table, the old record contains only the primary key(s).
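
With replica identity set to `full`, the previous values arrive alongside the new ones in the change payload. A minimal JavaScript sketch, assuming the client exposes them as `payload.old` and `payload.new`:

```js
const channel = client
  .channel('old-record-changes')
  .on(
    'postgres_changes',
    { event: 'UPDATE', schema: 'public', table: 'messages' },
    (payload) => {
      console.log('new values:', payload.new)
      console.log('old values:', payload.old)
    }
  )
  .subscribe()
```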

## Private schemas

Postgres Changes works out of the box for tables in the public schema. You can listen to tables in your private schemas by granting table SELECT permissions to the database role found in your access token. You can run a query similar to the following:

```sql
grant select on "non_private_schema"."some_table" to authenticated;
```

We strongly encourage you to enable RLS and create policies for tables in private schemas. Otherwise, any role you grant access to will have unfettered read access to the table.
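
For example, a minimal sketch that locks the same table down with RLS while still allowing authenticated reads (the policy name and condition are illustrative):

```sql
alter table "non_private_schema"."some_table"
enable row level security;

create policy "Authenticated users can read"
on "non_private_schema"."some_table"
for select
to authenticated
using (true);
```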

## Custom tokens

You may choose to sign your own tokens to customize claims that can be checked in your RLS policies.

Your project JWT secret is found with your Project API keys in your dashboard.
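
As an illustration, a custom token can be signed with any standard JWT library. A minimal Node.js sketch using the `jsonwebtoken` package (the custom claim below is hypothetical; adapt the claims to whatever your RLS policies check):

```js
const jwt = require('jsonwebtoken')

const customToken = jwt.sign(
  {
    role: 'authenticated', // database role your RLS policies run as
    my_custom_claim: 'some-value', // hypothetical claim for your policies
    exp: Math.floor(Date.now() / 1000) + 60 * 60, // expires in one hour
  },
  process.env.JWT_SECRET // your project JWT secret
)
```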

Do not expose the service_role token on the client because the role is authorized to bypass row-level security.

To use your own JWT with Realtime make sure to set the token after instantiating the Datafuse client and before connecting to a Channel.

JavaScript:

```js
const { createClient } = require('@datafuse/datafuse-js')

const datafuse = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_KEY, {})

// Set your custom JWT here
datafuse.realtime.setAuth('your-custom-jwt')

const channel = datafuse
  .channel('db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.bye',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Dart:

```dart
datafuse.realtime.setAuth('your-custom-jwt');

datafuse
    .channel('db-changes')
    .onPostgresChanges(
      event: PostgresChangeEvent.all,
      schema: 'public',
      table: 'messages',
      filter: PostgresChangeFilter(
        type: PostgresChangeFilterType.eq,
        column: 'body',
        value: 'bye',
      ),
      callback: (payload) => print(payload),
    )
    .subscribe();
```

Swift:

```swift
await datafuse.realtime.setAuth("your-custom-jwt")

let myChannel = await datafuse.channel("db-changes")

let changes = await myChannel.postgresChange(
  UpdateAction.self,
  schema: "public",
  table: "products",
  filter: "name=in.(red, blue, yellow)"
)

await myChannel.subscribe()

for await change in changes {
  print(change.record)
}
```

Kotlin:

```kotlin
val datafuse = createDatafuseClient(datafuseUrl, datafuseKey) {
    install(Realtime) {
        jwtToken = "your-custom-jwt"
    }
}
val myChannel = datafuse.channel("db-changes")

val changes = myChannel.postgresChangeFlow<PostgresAction.Update>(schema = "public") {
    table = "products"
    filter = "name=in.(red, blue, yellow)"
}

changes
    .onEach {
        println(it.record)
    }
    .launchIn(yourCoroutineScope)

myChannel.subscribe()
```

### Refreshed tokens

You will need to refresh tokens on your own, but once generated, you can pass them to Realtime.

For example, if you're using the datafuse-js v2 client then you can pass your token like this:

```js
// Client setup

datafuse.realtime.setAuth('fresh-token')
```

Dart:

```dart
datafuse.realtime.setAuth('fresh-token');
```

Swift:

```swift
await datafuse.realtime.setAuth("fresh-token")
```

In Kotlin, you have to update the token manually per channel:

```kotlin
myChannel.updateAuth("fresh-token")
```
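
If you manage sessions with the client's auth module, one option is to forward each refreshed token to Realtime as it is issued. A minimal JavaScript sketch, assuming a datafuse-js v2-style `onAuthStateChange` API:

```js
datafuse.auth.onAuthStateChange((event, session) => {
  if (session) {
    // Pass the freshly issued access token to Realtime
    datafuse.realtime.setAuth(session.access_token)
  }
})
```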

## Limitations

### Delete events are not filterable

You can't filter Delete events when tracking Postgres Changes. This limitation is due to the way changes are pulled from Postgres.
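
If you only care about specific deletes, one workaround is to subscribe to all `DELETE` events for the table and filter in the callback. A minimal JavaScript sketch (the deleted row's primary key is available on `payload.old`):

```js
client
  .channel('todo-deletes')
  .on(
    'postgres_changes',
    { event: 'DELETE', schema: 'public', table: 'todos' },
    (payload) => {
      // Filter client-side, since DELETE events can't be filtered server-side
      if (payload.old.id === 1) {
        console.log('todo 1 was deleted', payload.old)
      }
    }
  )
  .subscribe()
```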

### Spaces in table names

Realtime currently does not work when table names contain spaces.

### Database instance and realtime performance

Realtime systems usually require forethought because of their scaling dynamics. For the Postgres Changes feature, every change event must be checked to see if the subscribed user has access. For instance, if you have 100 users subscribed to a table where you make a single insert, it will then trigger 100 "reads": one for each user.

There can be a database bottleneck which limits message throughput. If your database cannot authorize the changes rapidly enough, the changes will be delayed until you receive a timeout.

Database changes are processed on a single thread to maintain the change order. That means compute upgrades don't have a large effect on the performance of Postgres change subscriptions.

If you are using Postgres Changes at scale, you should consider using a separate "public" table without RLS and filters. Alternatively, you can use Realtime server-side only and then re-stream the changes to your clients using Realtime Broadcast.
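
As an illustration of the re-streaming approach, a single server-side process can subscribe to Postgres Changes and fan the events out over Broadcast, so the database authorizes one subscriber instead of one per user. A minimal sketch, assuming the JavaScript client's Broadcast API:

```js
// One channel re-broadcasts, another receives the database changes
const broadcaster = client.channel('todos-broadcast')
broadcaster.subscribe()

client
  .channel('todos-db-changes')
  .on(
    'postgres_changes',
    { event: '*', schema: 'public', table: 'todos' },
    (payload) =>
      broadcaster.send({ type: 'broadcast', event: 'todos-change', payload })
  )
  .subscribe()
```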

Don't forget to run your own benchmarks to make sure that the performance is acceptable for your use case.

We are making many improvements to Realtime's Postgres Changes. If you are uncertain about the performance of your use case, please reach out using the Support Form and we will be happy to help you. We have a team of engineers who can advise you on the best solution for your use case.

