CoolR Data Integration Specification
- Version: 1.0
- Last Updated: February 24, 2026
- Audience: Data Engineers, Business Analysts, Integration Teams
Table of Contents
- Overview
- Integration Methods
- Export Types
- File Formats
- Data Structure
- Common Data Dictionary
- Export File Specifications
- Master Data Considerations
- Delta Processing
- Setup & Alerts
Overview
CoolR Data Publisher provides automated data exports from your CoolR system to your preferred storage and analysis platform. This replaces complex API pagination with simple file-based integration.
Key Benefits
- Simplified Integration: No API pagination complexity
- Automated Delivery: Scheduled exports run automatically
- Multiple Access Methods: From highly secure (Managed Identity) to flexible (SFTP)
- Efficient Delta Processing: Only process changed data using the ModifiedOn field
- Multiple Formats: Parquet (recommended), CSV, JSON
Integration Methods
CoolR Data Publisher is optimized for Azure using Parquet format for maximum performance and security.
Recommended: Azure + Parquet
Why This Combination?
| Benefit | Impact |
|---|---|
| 50-80% smaller files | Reduced storage costs and faster transfers |
| 5-10x faster queries | Columnar storage enables faster analytics |
| Zero credential management | No passwords or keys to rotate with Managed Identity |
| Native Azure integration | Works seamlessly with Synapse, Databricks, Power BI |
| Data type preservation | No manual conversion needed (dates, numbers stay typed) |
Integration Methods Comparison
| Method | Security | Setup Effort | Credentials | Best For |
|---|---|---|---|---|
| Managed Identity | Highest | Low | None | Azure VMs, Functions, Synapse Analytics |
| Workload Identity Federation | Highest | Medium | None | Azure Kubernetes (AKS), GitHub Actions |
| Webhook Notification | High | Low | Your endpoint | Real-time event triggers, Logic Apps |
| SAS Token | Medium | Low | Time-limited token | Testing, temporary access only |
| SFTP | Lower | Medium | Username/password | Legacy systems (not recommended) |
Setup Decision Guide
Choose your integration method based on your infrastructure:
- Using Azure services? → Managed Identity (Recommended)
  - Azure Virtual Machines, Azure Functions, Azure Synapse Analytics
  - Zero configuration, highest security, no credential management
- Using Kubernetes or GitHub Actions? → Workload Identity Federation
  - Azure Kubernetes Service (AKS), GitHub Actions workflows
  - No credentials needed, federated trust model
- Need real-time notifications? → Webhook Notification
  - Azure Logic Apps, custom endpoints
  - Get notified immediately when new exports are available
- Quick testing or temporary access? → SAS Token
  - Development, testing, proof-of-concept
  - Use short expiration periods (24-48 hours max)
- Legacy non-Azure systems? → SFTP (consider migrating to Azure)
  - Existing on-premises systems
  - Higher credential management overhead
SFTP Configuration Requirements
If you're using SFTP as your delivery method, you'll need to provide the following configuration to CoolR:
Required Configuration Fields
| Field | Description | Example |
|---|---|---|
| StorageType | Must be set to "SFTP" | SFTP |
| Host | SFTP server hostname or IP address | sftp.yourcompany.com |
| Port | SFTP server port (default: 22) | 22 |
| Username | SFTP authentication username | coolr_export_user |
| HomeDirectory | Landing directory for files on SFTP server | /exports or /data/coolr |
Authentication Methods
CoolR Data Publisher supports two authentication methods for SFTP:
1. Password Authentication (Simple)
| Field | Description | Security Note |
|---|---|---|
| Password | SFTP user password | Store securely, rotate regularly |
2. Private Key Authentication (Recommended)
| Field | Description | Format |
|---|---|---|
| SftpPrivateKey | PEM-encoded private key | RSA, ECDSA, or Ed25519 |
| SftpPrivateKeyPassphrase | Passphrase to decrypt key (optional) | Only if key is encrypted |
Private Key Example:
-----BEGIN RSA PRIVATE KEY-----
MIIEpAIBAAKCAQEA...
...
-----END RSA PRIVATE KEY-----
File Delivery Behavior
Important SFTP Limitations:
Due to permission restrictions on most client SFTP servers, the CoolR Data Publisher cannot create directories on your SFTP server. To work around this:
- All files are uploaded to your HomeDirectory
- Directory paths are encoded in the filename using double underscores (__)
File Naming Format:
{clientcode}-{clientid}__{timestamp}__{filename}
Example:
clientcode-464__20260210133506__outlet_dump_01.01.parquet
clientcode-464__20260210133506__manifest.json
clientcode-464__20260210133506__assetpurity_delta_01.01.parquet
What This Means:
- Instead of creating /exports/20260210133506/outlet_dump_01.01.parquet
- Files are placed as /exports/clientcode-464__20260210133506__outlet_dump_01.01.parquet
This flattened naming convention allows you to:
- Parse the filename to extract the timestamp and export name
- Process files in order by parsing the timestamp segment
- Organize files in your downstream processing as needed
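As a minimal sketch of the parsing step above, the flattened filename can be split on the double-underscore separator; the helper name and return shape here are illustrative, not part of the CoolR product:

```python
from datetime import datetime

def parse_export_filename(filename: str) -> dict:
    """Split a flattened SFTP filename into its components.

    Expected shape: {clientcode}-{clientid}__{timestamp}__{original_name}
    e.g. clientcode-464__20260210133506__outlet_dump_01.01.parquet
    """
    container, timestamp, original_name = filename.split("__", 2)
    client_code, client_id = container.rsplit("-", 1)
    return {
        "client_code": client_code,
        "client_id": int(client_id),
        # The timestamp segment uses the yyyyMMddHHmmss convention
        "exported_at": datetime.strptime(timestamp, "%Y%m%d%H%M%S"),
        "original_name": original_name,
    }

info = parse_export_filename(
    "clientcode-464__20260210133506__outlet_dump_01.01.parquet")
print(info["exported_at"])    # 2026-02-10 13:35:06
print(info["original_name"])  # outlet_dump_01.01.parquet
```

Sorting parsed results by `exported_at` gives you the processing order described above.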
SFTP Connection Details
Connection Behavior:
- Timeout: 30 seconds per connection attempt
- Retry Logic: 3 attempts with exponential backoff (1s, 2s delays)
- Error Handling: Detailed error messages include host, port, and username for troubleshooting
Security Best Practices
- Use Private Key Authentication instead of passwords when possible
- Encrypt Private Keys with a strong passphrase
- Restrict SFTP User Permissions to only the necessary directories
- Use Non-Standard Ports if your security policy requires it
- Monitor Failed Connection Attempts for security incidents
- Rotate Credentials Regularly (passwords every 90 days, keys every 12 months)
Example Configuration
Password Authentication:
{
"ClientId": 1003,
"StorageType": "SFTP",
"Host": "sftp.example.com",
"Port": 22,
"Username": "coolr_exports",
"Password": "[SECURE_PASSWORD]",
"HomeDirectory": "/data/exports"
}
Private Key Authentication:
{
"ClientId": 1003,
"StorageType": "SFTP",
"Host": "sftp.example.com",
"Port": 22,
"Username": "coolr_exports",
"SftpPrivateKey": "<SFTP_PRIVATE_KEY_PEM>",
"SftpPrivateKeyPassphrase": "[KEY_PASSPHRASE]",
"HomeDirectory": "/data/exports"
}
Common SFTP Issues & Troubleshooting
| Issue | Possible Cause | Solution |
|---|---|---|
| Connection timeout | Firewall blocking port 22 | Whitelist CoolR IP addresses, verify port is open |
| Authentication failed | Invalid credentials or key | Verify username/password or test private key locally |
| Permission denied | User lacks write permissions | Grant write access to HomeDirectory for SFTP user |
| Files not appearing | Wrong HomeDirectory path | Verify path exists and is accessible to SFTP user |
| Connection refused | SFTP service not running | Verify SFTP server is running and accessible |
Export Types
| Aspect | Full Data Dump | Incremental Data Delta |
|---|---|---|
| Purpose | Complete snapshot of all data | Only changed/new records since last export |
| Frequency | Less frequent (daily, weekly) | More frequent (hourly, every 4-6 hours) |
| Data Volume | Large (all records) | Small (only changes) |
| Use Case | Initial load, complete refresh, backup | Incremental updates, real-time sync |
| Processing Time | Longer | Faster |
| Key Field | All records included | Only records modified since last export |
| IsDeleted Handling | All records regardless of status | Includes deleted records (IsDeleted=1) for sync |
When to Use Each Type
Full Data Dump:
- Initial data warehouse setup
- Monthly/quarterly complete refresh
- Disaster recovery backups
- Data validation and reconciliation
Incremental Data Delta:
- Daily operational updates
- Real-time dashboards and reporting
- Event-driven data pipelines
- Efficient incremental ETL processes
File Formats
CoolR exports support three file formats. Parquet is strongly recommended for performance and efficiency.
| Feature | Parquet ⭐ (Recommended) | CSV | JSON |
|---|---|---|---|
| Compression | Built-in (50-80% smaller) | No compression | No compression |
| Performance | ⭐⭐⭐⭐⭐ Fastest | ⭐⭐⭐ Moderate | ⭐⭐ Slow for large data |
| Data Types | Preserved (dates, numbers) | All text (need conversion) | Preserved |
| Columnar Storage | Yes (faster queries) | No | No |
| Excel Compatible | No (use Power BI/Tableau) | ✅ Yes | No |
| Best For | Data warehouses, analytics | Excel, legacy systems | APIs, web apps |
| File Size (1M rows) | ~50-100 MB | ~500 MB | ~800 MB |
| Load Speed (1M rows) | ~5 seconds | ~30 seconds | ~60 seconds |
Format Selection Guide
Choose Parquet if:
✓ Using modern analytics tools (Power BI, Tableau, Snowflake, Databricks)
✓ Processing large datasets (>100K rows)
✓ Need fast query performance
✓ Want automatic compression
Choose CSV if:
✓ Need Excel compatibility
✓ Working with legacy systems
✓ Small datasets (<50K rows)
✓ Human readability is important
Choose JSON if:
✓ Feeding web applications or APIs
✓ Need nested data structures
✓ Working with JavaScript/Node.js
Data Structure
File Naming Convention
Container Level:
{clientcode}-{clientid}
Example format:
clientcode-123
clientcode-987
clientcode-000
Folder Structure:
{container-name}/{yyyyMMddHHmmss}/{exportname}_{type}_{version}.parquet/{guid}_0_0_0.parquet
Example format:
clientcode-123/20260210133506/outlet_dump_01.00.parquet/019c47c3-55ac-7069-b007-fd4826b5983f_0_0_0.parquet
clientcode-123/20260210133506/assetpurity_delta_01.00.parquet/019c47c3-6476-7f09-b47f-b5c391282b16_0_0_0.parquet
Naming Components:
- {exportname}: outlet, asset, assetpurity, consumptionlog, etc.
- {type}: dump (full export) or delta (incremental export)
- {version}: Schema version (01.00)
- {guid}: Unique identifier for the parquet file
Directory Structure
Azure Blob Storage Container (per client)
{container-name}/ e.g., clientcode-123, clientcode-456, clientcode-789
└── {yyyyMMddHHmmss}/ e.g., 20260210133506
├── manifest.json
├── outlet_dump_01.00.parquet/
│ └── {guid}_0_0_0.parquet
├── asset_dump_01.00.parquet/
│ └── {guid}_0_0_0.parquet
├── assetpurity_delta_01.00.parquet/
│ └── {guid}_0_0_0.parquet
├── consumptionlog_delta_01.00.parquet/
│ └── {guid}_0_0_0.parquet
└── ...
Note: Each export creates a folder named {exportname}_{type}_01.00.parquet/
containing one or more parquet files with GUID-based names.
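The folder convention above can be assembled programmatically when polling blob storage for new exports. A minimal sketch (the helper name is illustrative):

```python
from datetime import datetime

def build_blob_prefix(client_code, client_id, exported_at,
                      export_name, export_type, version="01.00"):
    """Build the blob folder path for one export, following the
    documented convention:
    {container}/{yyyyMMddHHmmss}/{exportname}_{type}_{version}.parquet
    """
    container = f"{client_code}-{client_id}"
    stamp = exported_at.strftime("%Y%m%d%H%M%S")
    return f"{container}/{stamp}/{export_name}_{export_type}_{version}.parquet"

prefix = build_blob_prefix("clientcode", 123,
                           datetime(2026, 2, 10, 13, 35, 6),
                           "outlet", "dump")
print(prefix)
# clientcode-123/20260210133506/outlet_dump_01.00.parquet
```

Listing blobs under this prefix yields the GUID-named parquet files for that export run.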
Common Data Dictionary
Many fields appear across multiple exports with consistent meaning. Instead of repeating these in every schema, we've consolidated them into a Common Schema document.
For complete field definitions, see the Common Schema which includes:
- Client Information
- Location Information
- Product Information
- Asset Information
- Smart Device Information
- Audit Fields
- Geographic Information
Below is a quick summary for reference:
Client & Organization Fields
| Field Name | Description |
|---|---|
| ClientId | Unique identifier for the client organization |
| ClientName | Client organization name |
| ClientCode | Client organization code |
Location Fields (Common across Outlet, Asset, Alert, etc.)
| Field Name | Description |
|---|---|
| LocationId | Unique location/outlet identifier |
| Name | Location name |
| Code | Location code |
| ExternalLocationCode | External system location code |
| Street | Primary street address |
| Street2 | Street address line 2 |
| Street3 | Street address line 3 |
| City | City name |
| PostalCode | Postal/ZIP code |
| Latitude | GPS latitude coordinate for mapping |
| Longitude | GPS longitude coordinate for mapping |
| StateId | State/province identifier |
| State | State/province name |
| CountryId | Country identifier |
| Country | Country name |
| MarketId | Market identifier |
| MarketName | Market name |
| ChannelId | Location type/channel identifier |
| Channel | Location type/channel name (e.g., Retail, Wholesale) |
| ClassificationId | Location classification identifier |
| Classification | Location classification name |
| TimeZone | Timezone display name |
Asset Fields (Common across Asset, AssetPurity, Alert, etc.)
| Field Name | Description |
|---|---|
| AssetId | Unique asset identifier |
| AssetSerialNumber | Physical asset serial number printed on device |
| ExternalAssetCode | External system asset code |
| AssetTypeId | Asset type identifier |
| AssetType | Asset type name (e.g., Cooler, Refrigerator) |
| PlanogramId | Assigned planogram identifier |
| PlanogramName | Planogram name |
| ExternalPlanogramCode | External system planogram code |
Product Fields (Common across Product, ConsumptionLog, etc.)
| Field Name | Description |
|---|---|
| ProductId | Unique product identifier |
| Product | Product name |
| ShortName | Product short name |
| SKU | Stock Keeping Unit code |
| ExternalProductCode | External system product code |
| BrandId | Brand identifier |
| Brand | Brand name |
| ProductCategory | Product category name |
Smart Device Fields (Common across SmartDevice, AssetPurity, etc.)
| Field Name | Description |
|---|---|
| SmartDeviceId | Unique smart device identifier |
| SmartDeviceSerialNumber | Smart device serial number |
| SmartDeviceType | Smart device type name (e.g., Camera, Sensor) |
| MacAddress | Device MAC address |
Standard Audit & Tracking Fields
| Field Name | Description |
|---|---|
| IsDeleted | Soft delete flag (0=Active, 1=Deleted) - See Master Data Considerations |
| CreatedOn | Timestamp when record was created |
| ModifiedOn | Timestamp when record was last modified ⭐ Critical for delta processing |
| CreatedBy / CreatedByUser | Username who created the record |
| ModifiedBy / ModifiedByUser | Username who last modified the record |
⭐ ModifiedOn is critical for delta processing - see Delta Processing section.
Note: All export specifications below include complete field lists. Common fields are repeated in each export to ensure you have all field definitions in one place without cross-referencing.
Export File Specifications
The CoolR Data Publisher provides 18 export files organized into two categories:
🔄 Incremental Data Delta - Recommended
For ongoing data synchronization with efficient, real-time updates. View Incremental Data Delta Documentation →
| Export Name | Primary Purpose | Update Frequency |
|---|---|---|
| Asset Purity | Image analysis results showing cooler contents | Every 1-6 hours |
| Asset Day KPI | Daily asset performance metrics | Daily |
| Asset Purity Product | Product positions in images | Every 1-6 hours |
| Consumption Log | Product consumption/sales data | Every 1-6 hours |
| Scheduled Order Item | Order line items | Every 1-6 hours |
📦 Full Data Dump (Master Data)
Complete snapshots of master data tables for initial setup. View Full Data Dump Documentation →
| Export Name | Primary Purpose |
|---|---|
| Outlet | Location/outlet master data |
| Asset | Cooler/equipment master data |
| Product | Product catalog master data |
| Brand | Brand and manufacturer data |
| Market | Market hierarchy data |
| Planogram | Planogram configurations |
| Smart Device | IoT sensor/camera master data |
| Product Pricing | Product pricing by case size |
| Location Product | Location-specific pricing |
| Location Classification | Location classification lookup |
| Channel | Location type/channel lookup |
| Alert | Alert history (last 90 days) |
| User | User account data |
📋 Common Schema
To avoid duplication, common fields that appear across multiple exports are documented in the Common Schema.
📝 Version History
All schema changes across versions are tracked in the Version History.
Master Data Considerations
Understanding IsDeleted (Soft Deletes)
CoolR uses soft deletes instead of physically removing records from the database. This preserves data integrity and audit trails.
What is IsDeleted?
- IsDeleted = 0: Record is active and current
- IsDeleted = 1: Record has been deleted but preserved for history
When to Filter IsDeleted:
- For current/active data only (most common use case): Filter records where IsDeleted = 0
- For full history including deleted records: Include all records without filtering
- For deleted records only (audit/compliance): Filter records where IsDeleted = 1
Incremental Data Delta Behavior:
⚠️ Important: Incremental Data Delta exports INCLUDE deleted records (IsDeleted = 1) so you can sync deletions to your system. This ensures your target system stays in sync with deletions made in CoolR.
Best Practices:
- Initial Load (Full Data Dump):
  - Load only active records by filtering IsDeleted = 0
  - This gives you the current state of all active data
- Incremental Updates:
  - Process both updates and deletions from incremental data delta files
  - When IsDeleted = 1: Remove or mark the record as deleted in your system
  - When IsDeleted = 0: Update existing records or insert new ones
  - This keeps your system synchronized with CoolR
- Reporting:
  - Always filter for active records (IsDeleted = 0) in reports and dashboards
  - Use ModifiedOn to focus on recent changes if needed
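The upsert-and-delete rules for incremental updates can be sketched against a simple in-memory table; the function and field names beyond IsDeleted are illustrative, assuming delta rows arrive as dictionaries:

```python
def apply_delta(store: dict, delta_records: list, key: str) -> dict:
    """Apply one incremental delta to an in-memory table.

    store         -- current records keyed by their identifier
    delta_records -- rows from the delta file (includes IsDeleted=1 rows)
    key           -- primary key field, e.g. "LocationId" or "AssetId"
    """
    for record in delta_records:
        if record["IsDeleted"] == 1:
            # Deletion in CoolR: remove (or mark deleted) locally
            store.pop(record[key], None)
        else:
            # Insert new records and overwrite changed ones
            store[record[key]] = record
    return store

outlets = {1: {"LocationId": 1, "Name": "Store A", "IsDeleted": 0}}
delta = [
    {"LocationId": 1, "Name": "Store A (renamed)", "IsDeleted": 0},
    {"LocationId": 2, "Name": "Store B", "IsDeleted": 0},
    {"LocationId": 3, "Name": "Closed store", "IsDeleted": 1},
]
apply_delta(outlets, delta, "LocationId")
print(sorted(outlets))  # [1, 2]
```

In a real pipeline the same logic would run as an UPDATE/INSERT/DELETE (or soft-delete flag) against your warehouse tables.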
Delta Processing
Using ModifiedOn for Incremental Updates
Every record includes a ModifiedOn timestamp that updates whenever the record changes. This enables efficient incremental processing.
How Delta Processing Works:
The CoolR Data Publisher uses the timestamp of the last successful export to determine which records to include in incremental data delta exports. This ensures you receive all changes since your last successful import.
Incremental Data Delta Logic:
- Timestamp Source: Incremental data delta exports use the timestamp from the last successful export run for that client and export type
- First Run or Long Gaps: If this is the first incremental export OR the last successful export was more than 1 month ago, the export is capped to 1 month of data
- Filter Applied: Records are filtered to include only those modified after your last export date (for AssetDayKPI, uses the KPI date instead)
This 1-month cap prevents excessively large incremental data delta exports on first run or after long periods without exports, while still ensuring you get recent data.
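The window selection above can be sketched as follows; this models the documented behaviour on the client side for reasoning purposes, and approximating "1 month" as 30 days is an assumption, not the product's exact definition:

```python
from datetime import datetime, timedelta
from typing import Optional

def delta_window_start(last_success: Optional[datetime],
                       now: datetime,
                       cap_days: int = 30) -> datetime:
    """Lower bound on ModifiedOn for the next delta export.

    First run (no prior success) or a gap longer than the cap falls
    back to now - cap_days; otherwise the last successful export
    timestamp is used.
    """
    cap = now - timedelta(days=cap_days)
    if last_success is None or last_success < cap:
        return cap  # first run or long gap: capped window
    return last_success

now = datetime(2026, 2, 10, 13, 35, 6)
print(delta_window_start(None, now))                  # capped to 30 days back
print(delta_window_start(datetime(2026, 2, 9), now))  # 2026-02-09 00:00:00
```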
Your Processing Steps:
- Retrieve Current Watermark: Get the timestamp of your last successful import for each table
- Load Incremental Records: Read the incremental data delta export file containing only changed records
- Apply Changes:
  - Update existing records that have changed
  - Insert new records that were created
  - Delete or mark as deleted records where IsDeleted = 1
- Update Watermark: Save the maximum ModifiedOn timestamp as your new watermark
- Track Completion: Record when the import completed for monitoring purposes
Key Concepts:
- Server-Side Filtering: CoolR filters records server-side, so incremental data delta files only contain relevant changes
- Watermark per Table: Maintain separate watermarks for each export file (location, asset, etc.)
- Transaction Safety: Process incremental imports within a transaction to ensure all-or-nothing updates
- Idempotency: Incremental data delta processing should be safe to re-run if a previous attempt failed partway through
- 1-Month Cap: First incremental export or gaps > 1 month will only include the most recent month of data
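The watermark and idempotency concepts above can be combined into one small sketch; the function shape is illustrative, assuming ModifiedOn arrives as a sortable ISO-8601 string:

```python
def process_delta(records, watermark, apply_change):
    """Apply delta rows newer than `watermark` and return the new
    watermark (max ModifiedOn seen). Skipping rows at or below the
    old watermark makes re-runs after a partial failure safe."""
    new_watermark = watermark
    for record in sorted(records, key=lambda r: r["ModifiedOn"]):
        if watermark is not None and record["ModifiedOn"] <= watermark:
            continue  # already imported on a previous run
        apply_change(record)
        if new_watermark is None or record["ModifiedOn"] > new_watermark:
            new_watermark = record["ModifiedOn"]
    return new_watermark

applied = []
rows = [
    {"LocationId": 1, "ModifiedOn": "2026-02-10T08:00:00", "IsDeleted": 0},
    {"LocationId": 2, "ModifiedOn": "2026-02-10T09:30:00", "IsDeleted": 0},
]
wm = process_delta(rows, "2026-02-10T08:30:00", applied.append)
print(wm)            # 2026-02-10T09:30:00
print(len(applied))  # 1
```

Persist the returned watermark per export file only after the surrounding transaction commits, so a failed run simply replays from the old watermark.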
Incremental Data Delta Processing Recommendations:
| Frequency | Use Case | Example Schedule |
|---|---|---|
| Every 4-6 hours | Operational dashboards | 00:00, 06:00, 12:00, 18:00 |
| Every 1 hour | Real-time inventory | Every hour on the hour |
| Daily | Standard reporting | 02:00 AM daily |
| Weekly | Analytics refresh | Sunday 03:00 AM |
Setup & Alerts
Recommended Alert Configuration
Configure alerts in your data pipeline to monitor export health:
| Alert Type | Threshold | Action |
|---|---|---|
| Export Failure | Any failed export | Immediate notification to data team |
| Export Delay | >2 hours past schedule | Warning notification |
| File Size Anomaly | >50% deviation from average | Investigate data quality |
| No New Data | Delta export with 0 records for 24h | Check source system |
| SAS Token Expiry | <24 hours remaining | Download files immediately |
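The file-size anomaly threshold in the table above (>50% deviation from average) can be checked with a few lines; the helper name and sample sizes are illustrative:

```python
def size_anomaly(sizes_history, latest_size, threshold=0.5):
    """Flag a new export file whose size deviates more than
    `threshold` (50% by default) from the historical average."""
    avg = sum(sizes_history) / len(sizes_history)
    deviation = abs(latest_size - avg) / avg
    return deviation > threshold

history = [100, 110, 95, 105]     # sizes (MB) of recent exports
print(size_anomaly(history, 40))  # True  (well below average)
print(size_anomaly(history, 120)) # False (within tolerance)
```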
Getting Started
Before going live with your integration:
- Choose Integration Method: Select from Managed Identity, Workload Identity Federation, Webhook, SAS Token, or SFTP
- Select File Format: Parquet (recommended), CSV, or JSON
- Define Export Schedule: Determine Full export and Delta export frequencies
- Configure Notifications: Set up email or webhook notifications
- Test Initial Load: Successfully import a Full export
- Test Delta Processing: Successfully process an incremental Delta export
Questions or Issues? Contact your CoolR integration specialist or email: support@coolrgroup.com