Postgres destination for batch exports

Batch exports can be used to export data to a Postgres table.

Setting up Postgres access

  1. Make sure PostHog can access your Postgres database.

Note: Wherever your Postgres database is hosted, make sure the host is set to accept all incoming connections so that PostHog can connect to the database and insert events. PostHog does not guarantee a static list of IP addresses to whitelist. If this is not possible in your case, consider exporting data to a different destination (like S3) and then setting up your own system for getting data into your Postgres database.
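
To confirm from the database side that remote connections are possible, you can inspect the server's settings from a SQL session. This is a minimal sketch, assuming you are connected as a superuser (the pg_hba_file_rules view is restricted by default); on managed services such as RDS or Heroku, network access is usually configured through the provider's security settings instead.

SQL
-- Addresses the server listens on ('*' means all interfaces)
SHOW listen_addresses;

-- Active client authentication rules (PostgreSQL 10 and later)
SELECT type, database, user_name, address, auth_method
FROM pg_hba_file_rules;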

  2. Create a Postgres user with table creation privileges.

When a batch export runs, it creates the destination table if it doesn't exist, which is why the CREATE and USAGE privileges are required. You can and should block PostHog from doing anything else on any other tables. In particular, we recommend creating a new schema and only granting PostHog CREATE and USAGE access limited to that schema:

SQL
CREATE USER posthog WITH PASSWORD 'insert-a-strong-password-here';
CREATE SCHEMA posthog_exports;
GRANT CREATE ON SCHEMA posthog_exports TO posthog;
GRANT USAGE ON SCHEMA posthog_exports TO posthog;
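
After running the statements above, you can optionally confirm that the new role has exactly the access it needs by using PostgreSQL's built-in privilege inspection functions:

SQL
-- Both of these should return true
SELECT has_schema_privilege('posthog', 'posthog_exports', 'CREATE');
SELECT has_schema_privilege('posthog', 'posthog_exports', 'USAGE');

Depending on your PostgreSQL version and defaults, the posthog role may still have privileges elsewhere (for example, CREATE on the public schema is granted to all roles on versions before PostgreSQL 15); revoke anything you don't want it to have.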

Event schema

This is the schema of all the fields that are exported to Postgres.

Field | Type | Description
----- | ---- | -----------
uuid | VARCHAR(200) | The unique ID of the event within PostHog
event | VARCHAR(200) | The name of the event that was sent
properties | JSONB | A JSON object with all the properties sent along with an event
elements | JSONB | This field is present for backwards compatibility but has been deprecated
set | JSONB | A JSON object with any person properties sent with the $set field
set_once | JSONB | A JSON object with any person properties sent with the $set_once field
distinct_id | VARCHAR(200) | The distinct_id of the user who sent the event
team_id | INTEGER | The team_id for the event
ip | VARCHAR(200) | The IP address that was sent with the event
site_url | VARCHAR(200) | This field is present for backwards compatibility but has been deprecated
timestamp | TIMESTAMP WITH TIME ZONE | The timestamp associated with an event
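
PostHog creates the destination table automatically if it doesn't exist, but if you want to pre-create it yourself (or just see roughly what the generated table looks like), the schema above corresponds to DDL along these lines. This is an illustrative sketch using the schema from the setup steps and a hypothetical table name posthog_events; the exact statement PostHog generates may differ:

SQL
CREATE TABLE IF NOT EXISTS posthog_exports.posthog_events (
    "uuid" VARCHAR(200),
    "event" VARCHAR(200),
    "properties" JSONB,
    "elements" JSONB,
    "set" JSONB,
    "set_once" JSONB,
    "distinct_id" VARCHAR(200),
    "team_id" INTEGER,
    "ip" VARCHAR(200),
    "site_url" VARCHAR(200),
    "timestamp" TIMESTAMP WITH TIME ZONE
);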

Creating the batch export

  1. Navigate to the exports page in your PostHog instance (quick links are available for PostHog Cloud US and PostHog Cloud EU).
  2. Click "Create export workflow".
  3. Select Postgres as the batch export destination.
  4. Fill in the necessary configuration details.
  5. Finalize the creation by clicking on "Create".
  6. Done! The batch export will schedule its first run at the start of the next period.
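
Once the first run completes, you can verify that events are arriving by querying the destination table directly. A minimal sketch, assuming the posthog_exports schema and a table named posthog_events as in the earlier examples:

SQL
-- Count events exported in the last day
SELECT count(*) AS exported_events
FROM posthog_exports.posthog_events
WHERE "timestamp" >= now() - interval '1 day';

-- Peek at the most recent rows and pull a property out of the JSONB payload
SELECT "event", "properties"->>'$browser' AS browser, "timestamp"
FROM posthog_exports.posthog_events
ORDER BY "timestamp" DESC
LIMIT 10;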

Postgres configuration

Configuring a batch export targeting Postgres requires the following Postgres-specific configuration values:

  • User: A Postgres user with CREATE TABLE access that PostHog uses to log in to your database.
  • Password: The password for the user provided.
  • Host: The host name of the server on which your Postgres database is running.
  • Port: The TCP port on which the Postgres database server is listening for connections.
  • Table name: The name of the Postgres table to export the data to.
  • Database: The name of the Postgres database that contains the destination table.
  • Schema: The name of the Postgres schema that contains the destination table.
  • Does your Postgres instance have a self-signed SSL certificate?: In most cases, Heroku and RDS users should check this box.
