Quickstart Guide

Get your analytics pipeline running in under 5 minutes. This guide walks you through initializing the DataPulse SDK, configuring your first data stream, and validating event ingestion.

Prerequisites
Ensure you have Node.js 18+ installed, an active DataPulse account, and your project API keys from the dashboard.

1. Install the SDK

DataPulse provides official SDKs for JavaScript, Python, and Go. We'll use the npm package for this guide:

```bash
npm install @datapulse/analytics-sdk
# or with yarn
yarn add @datapulse/analytics-sdk
```

2. Initialize & Configure

Import the client and initialize it with your project credentials. Securely store your keys using environment variables.

```javascript
import { DataPulseClient } from '@datapulse/analytics-sdk';

const client = new DataPulseClient({
  apiKey: process.env.DATAPULSE_API_KEY,
  projectId: process.env.DATAPULSE_PROJECT_ID,
  environment: 'production', // or 'staging'
  batchSize: 50,       // events per batch
  flushInterval: 10000 // ms between automatic flushes
});

// Validate connection
await client.ping();
console.log('✓ Connected to DataPulse edge network');
```
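To clarify how `batchSize` and `flushInterval` interact: events accumulate in a local queue and are sent when the queue reaches `batchSize` or when `flushInterval` elapses, whichever comes first. The sketch below illustrates those semantics only; it is not the SDK's internal implementation, and the `EventBatcher` name and `onFlush` callback are illustrative.

```javascript
// Illustrative sketch of batch/flush semantics (not the SDK's internals).
// Events queue up until batchSize is reached or flushInterval elapses.
class EventBatcher {
  constructor({ batchSize, flushInterval, onFlush }) {
    this.batchSize = batchSize;
    this.flushInterval = flushInterval;
    this.onFlush = onFlush; // receives the array of batched events
    this.queue = [];
    this.timer = null;
  }

  enqueue(event) {
    this.queue.push(event);
    if (this.queue.length >= this.batchSize) {
      this.flush(); // size threshold reached: send immediately
    } else if (!this.timer) {
      // start the interval clock on the first queued event
      this.timer = setTimeout(() => this.flush(), this.flushInterval);
    }
  }

  flush() {
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    if (this.queue.length === 0) return;
    const batch = this.queue;
    this.queue = [];
    this.onFlush(batch);
  }
}
```

A larger `batchSize` reduces network round-trips at the cost of delivery latency; a shorter `flushInterval` bounds how stale a queued event can get.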

3. Track Your First Event

Events are the core unit of data in DataPulse. Use the track() method to push structured events to your pipeline.

```javascript
await client.track({
  event: 'user_signed_up',
  timestamp: new Date().toISOString(), // ISO 8601
  user_id: 'usr_8f3k29d',
  properties: {
    plan: 'enterprise',
    source: 'organic_search',
    metadata: { region: 'us-east-1', device: 'mobile' }
  }
});

console.log('✓ Event queued for ingestion');
```

Schema Enforcement
DataPulse validates all incoming events against your defined JSON Schema. Mismatched payloads are quarantined for review, so define your schemas in the Schema Registry before you start tracking.
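For example, a Schema Registry entry for the `user_signed_up` event might look like the following JSON Schema. This is an illustrative sketch, not a schema shipped with DataPulse; the `plan` enum values are assumptions based on the example above.

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "user_signed_up",
  "type": "object",
  "required": ["event", "timestamp", "properties"],
  "properties": {
    "event": { "const": "user_signed_up" },
    "timestamp": { "type": "string", "format": "date-time" },
    "user_id": { "type": "string" },
    "properties": {
      "type": "object",
      "required": ["plan", "source"],
      "properties": {
        "plan": { "enum": ["free", "pro", "enterprise"] },
        "source": { "type": "string" }
      }
    }
  }
}
```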

4. Event Payload Structure

All events must conform to the following base structure. Optional fields enable advanced segmentation and attribution modeling.

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `event` | string | Yes | Unique event identifier (snake_case recommended) |
| `timestamp` | string (ISO 8601) | Yes | Exact time of event occurrence |
| `properties` | object | Yes | Arbitrary key-value pairs for event context |
| `user_id` | string | No | Anonymous or authenticated user identifier |
| `session_id` | string | No | Groups events into user sessions |
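To make the base structure concrete, here is a minimal client-side check against the fields in the table. This is an illustrative sketch only; the `validateEvent` helper is not part of the SDK, and DataPulse performs its own server-side validation regardless.

```javascript
// Illustrative check of the base event structure (not an SDK function).
// Returns an array of error messages; an empty array means the payload
// conforms to the required/optional fields in the table above.
function validateEvent(payload) {
  const errors = [];
  if (typeof payload.event !== 'string' || payload.event.length === 0) {
    errors.push('event must be a non-empty string');
  }
  if (typeof payload.timestamp !== 'string' ||
      Number.isNaN(Date.parse(payload.timestamp))) {
    errors.push('timestamp must be an ISO 8601 string');
  }
  if (typeof payload.properties !== 'object' ||
      payload.properties === null ||
      Array.isArray(payload.properties)) {
    errors.push('properties must be an object');
  }
  // Optional fields must be strings when present
  for (const field of ['user_id', 'session_id']) {
    if (payload[field] !== undefined && typeof payload[field] !== 'string') {
      errors.push(`${field} must be a string when provided`);
    }
  }
  return errors;
}
```

Running such a check before calling `track()` surfaces malformed payloads locally instead of discovering them later in the quarantine queue.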

Next Steps

Now that your pipeline is live, explore the Data Pipeline Architecture guide to understand ingestion flows, or head to the REST API Reference to programmatically manage datasets.