# Collect events from Segment

Start collecting events from Segment via a webhook.

Segment can forward every tracked event to Krenalis through a webhook destination. This setup keeps your data pipeline unified and ensures that each event reaches Krenalis in a consistent, structured format, ready for storage, enrichment, and activation.

## How it works

1. **Add a webhook destination in Segment.** Create a webhook connection in Segment that sends all your events to Krenalis in real time.
2. **Filter events in Krenalis.** Optionally configure filters that apply before event processing to control which data is stored in your warehouse.
3. **Ingest and store events.** Krenalis receives the incoming payloads, validates their schema, and loads them into your warehouse for unification and analysis.
4. **Activate your data.** Use these events in Krenalis Activation to build audiences, trigger workflows, and sync enriched data to external tools.

## Steps

If you've already added a source connection for the same Segment account, you can reuse it and start with the [Debug the events](#3-debug-the-events) step.

### 1. Connect Krenalis with Segment

1. In your Krenalis workspace, open the **Sources** page.
2. Click **Add a new source ⊕**, then select the **Segment** card.
3. Click **Add source...**.
4. Click **Add** to complete the connection.
5. On the connection page, click **Settings → Event write keys** and copy the **event write key** and the **endpoint URL**.

### 2. Add a webhook on the Segment dashboard

Next, create a webhook destination in Segment:

1. From the [Segment dashboard](https://app.segment.com/workspaces), go to **Connections → Destinations**.
2. Click **Add destination**, search for **Webhook**, then select it.
3. Click **Add destination** on the webhook page.
4. Choose the data sources to send to Krenalis, then click **Next**.
5. Enter a name for the destination and click **Create destination**.
6. In **Mappings**, click **New Mapping**, select **Send**, and set:

   | Field | Value |
   |-----------------|-------|
   | URL | The endpoint URL copied earlier. |
   | Method | `POST` |
   | Batch Size | `500` |
   | Headers | Click **Add Mapping Field**, enter `Authorization` as the header name and `Bearer EVENT_WRITE_KEY` as the value, where `EVENT_WRITE_KEY` is the event write key you copied earlier. |
   | Enable Batching | `Yes` |

   Refer to the following screen:

   ![Segment webhook destination configuration example](https://www.krenalis.com/docs/collect-events/images/segment-webhook.png)

7. Save and enable the pipeline.

### 3. Debug the events

Use the **Event debugger** in your source connection in the **Krenalis Admin console** to verify that events are received correctly.

1. In Krenalis, open your source connection.
2. Go to the **Event debugger** tab.

   ![Krenalis Event debugger showing Segment events](https://www.krenalis.com/docs/collect-events/images/event-debugger.segment.png)

   It shows a live sample of the most recent events received for this source connection. Use it whenever you need to quickly confirm that events are arriving as expected and to inspect their contents in real time.

3. Generate a `track` event in Segment. The event should appear within a few seconds:

   ![Krenalis Event debugger showing a track event](https://www.krenalis.com/docs/collect-events/images/event-debugger-track.segment.png)

4. Click the collected event in the **Event debugger** list to view its JSON payload, which contains the data sent by Segment.
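Before relying on the Segment mapping, you can exercise the endpoint yourself to confirm that the write key and endpoint URL are correct. The snippet below is a minimal sketch using Python's standard library; the endpoint URL and write key are placeholders for the values you copied in step 1, and the JSON-array body is an assumption about the batched payload shape (check the Event spec for the exact format).

```python
import json
import urllib.request

# Placeholders -- substitute the endpoint URL and event write key copied
# from Settings → Event write keys in your Krenalis source connection.
ENDPOINT_URL = "https://events.example.com/v1/batch"
EVENT_WRITE_KEY = "EVENT_WRITE_KEY"

# A minimal track event, mirroring the fields Segment's webhook sends.
event = {
    "type": "track",
    "event": "Workout Completed",
    "userId": "703991475",
    "properties": {"workout_type": "Cardio", "duration_minutes": 45},
}

# Build the request the same way the webhook mapping is configured:
# POST, JSON body, and an `Authorization: Bearer <write key>` header.
request = urllib.request.Request(
    ENDPOINT_URL,
    data=json.dumps([event]).encode("utf-8"),  # assumed: batches are JSON arrays
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {EVENT_WRITE_KEY}",
    },
    method="POST",
)

print(request.get_method(), request.full_url)
print(request.get_header("Authorization"))

# Uncomment to actually send the request:
# with urllib.request.urlopen(request) as response:
#     print(response.status)
```

If the test event is accepted, it should show up in the **Event debugger** described in step 3.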
The following example shows what a typical event payload looks like:

```json
{
  "anonymousId": "b27c5d9f-92a7-4d30-b21a-4df21a6872c2",
  "context": {
    "browser": { "name": "Safari", "version": "17.2.1" },
    "ip": "172.91.24.57",
    "library": { "name": "krenalis.js", "version": "1.0.0" },
    "locale": "en-US",
    "os": { "name": "macOS", "version": "14.5" },
    "page": {
      "path": "/dashboard",
      "title": "User Dashboard",
      "url": "https://app.example.com/dashboard"
    },
    "screen": { "width": 3024, "height": 1964, "density": 2 },
    "session": { "id": "1766272512048" },
    "timezone": "America/Los_Angeles",
    "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 14_5) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.2.1 Safari/605.1.15"
  },
  "event": "Workout Completed",
  "messageId": "f9189a52-b37b-4d7d-9f2d-08b91d85fa9c",
  "properties": {
    "workout_type": "Cardio",
    "duration_minutes": 45,
    "calories_burned": 380,
    "device": "Smartwatch"
  },
  "receivedAt": "2025-10-20T16:15:24.340Z",
  "sentAt": "2025-10-20T16:15:24.327Z",
  "originalTimestamp": "2025-10-20T16:15:23.992Z",
  "timestamp": "2025-10-20T16:15:24.010Z",
  "type": "track",
  "userId": "703991475"
}
```

If no event appears after a few seconds:

- Check your application logs for errors.
- Ensure your **event write key** and **endpoint URL** are correct.

💡 See the [Event spec](spec) for full details on the event schema.

### 4. Add a pipeline to import events

1. Click the **Pipelines** tab of the connection for Segment.
2. Next to the **Import events into warehouse** pipeline, click **Add pipeline...**.

   ![Import events into warehouse](/docs/collect-events/images/import-events-into-warehouse.segment.png)

   This is an "Import events" pipeline type, which transfers events received from Segment into your warehouse.

> Each pipeline defines how and when events flow from Segment into the warehouse. You can add multiple pipelines per connection to handle different data segments or destinations.
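When debugging, it can help to pull the fields you care about out of the raw payload programmatically rather than reading the JSON by eye. The sketch below assumes the payload shape shown above (trimmed to a few fields) and computes the transport delay from the `sentAt` and `receivedAt` timestamps:

```python
import json
from datetime import datetime

# A trimmed copy of the sample payload shown in the Event debugger.
payload = json.loads("""
{
  "event": "Workout Completed",
  "type": "track",
  "userId": "703991475",
  "properties": {"workout_type": "Cardio", "duration_minutes": 45},
  "sentAt": "2025-10-20T16:15:24.327Z",
  "receivedAt": "2025-10-20T16:15:24.340Z"
}
""")

# Basic sanity checks on the schema before trusting the event downstream.
assert payload["type"] == "track"
assert "event" in payload and "userId" in payload

# Transport delay: receivedAt minus sentAt (both ISO 8601 UTC timestamps).
sent = datetime.fromisoformat(payload["sentAt"].replace("Z", "+00:00"))
received = datetime.fromisoformat(payload["receivedAt"].replace("Z", "+00:00"))
delay_ms = (received - sent).total_seconds() * 1000

print(f"{payload['event']} by user {payload['userId']}: {delay_ms:.0f} ms in transit")
```

A consistently large gap between `sentAt` and `receivedAt` usually points at client-side buffering or network issues rather than a misconfigured webhook.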
### 5. Filter events

If you don't want to import all events from Segment, define a filter in Krenalis to control which events are imported into the warehouse. If no filter is set, all events from Segment will be imported. For more information on how to use filters, see the [Filters documentation](/docs/filters).

![Filter events](/docs/collect-events/images/filter-events.png)

In the example above, only `track` events are imported.

### 6. Save your changes

When you're done, click **Add** (or **Save** if you're editing an existing pipeline). The new pipeline will appear in your source connection and can be edited or disabled at any time.

## Pipelines

Once saved, the new pipeline appears in the pipelines list for Segment. From here, you can monitor imports and adjust filters. Each pipeline defines how and when events flow from Segment into your warehouse.

![Pipeline to import events](/docs/collect-events/images/pipelines.segment.png)

| Column | Description |
|--------------|--------------------------------------------------------------------------------------------|
| **Pipeline** | Name and description of the pipeline. |
| **Filter** | Condition that determines which events are processed. If not set, all events are included. |
| **Enable** | Switch to activate or deactivate the pipeline. When disabled, no events are processed. |
| **Manage** | Edit settings such as the filter. |
| **⋮ (More)** | Additional options, such as deleting the pipeline. |
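Filters are configured in the Krenalis UI, but their effect is easy to reason about: each incoming event is tested against a condition, and only matching events are imported. As an illustration only (not Krenalis's actual implementation), a filter that keeps only `track` events behaves like this predicate:

```python
def keep_event(event: dict) -> bool:
    """Illustrative filter: import only Segment `track` events."""
    return event.get("type") == "track"

# A mixed batch of Segment events, as a webhook destination might deliver it.
events = [
    {"type": "track", "event": "Workout Completed"},
    {"type": "page", "name": "Dashboard"},
    {"type": "identify", "userId": "703991475"},
]

# Only events that pass the filter reach the warehouse.
imported = [e for e in events if keep_event(e)]
print([e["type"] for e in imported])  # → ['track']
```

Events that fail the condition are dropped before import, which is why an overly strict filter can make the warehouse look empty even though the Event debugger shows traffic arriving.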