Integration Broker Architecture

Understanding Field Nation's middleware system that powers all pre-built connectors with reliable, scalable data synchronization.
The primary processing engine that:
Consumes messages from Redis queues
Executes field mappings and transformations
Dispatches API calls to target systems
Handles retries and error recovery
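The worker's consume-transform-dispatch-retry cycle can be sketched as follows. This is a minimal illustration, not the broker's actual implementation: plain deques stand in for the Redis queues, and the payload shape and helper functions are hypothetical.

```python
import json
from collections import deque

# In-memory stand-ins for the Redis queues (illustrative names).
inbound_queue = deque()
retry_queue = deque()
dead_letter_queue = deque()

MAX_RETRIES = 6

def transform(payload):
    # Placeholder for field mapping/transformation.
    return {"work_order_title": payload["subject"]}

def dispatch(work_order):
    # Placeholder for the Field Nation API call.
    return True

def process_one():
    """One iteration of the worker loop: consume, transform, dispatch."""
    if not inbound_queue:
        return None
    message = inbound_queue.popleft()
    try:
        work_order = transform(json.loads(message["body"]))
        dispatch(work_order)
        return "ok"
    except Exception:
        # On failure, count the attempt and either retry or dead-letter.
        message["attempts"] = message.get("attempts", 0) + 1
        if message["attempts"] >= MAX_RETRIES:
            dead_letter_queue.append(message)
            return "dead-lettered"
        retry_queue.append(message)
        return "retried"

inbound_queue.append({"body": json.dumps({"subject": "Replace router"})})
print(process_one())  # → ok
```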
Redis-based queue system providing:
Inbound Queue: Webhook triggers from external systems
Processing Queue: Active operations being executed
Retry Queue: Failed operations with exponential backoff
Dead Letter Queue (DLQ): Operations exceeding max retries
Dynamic mapping system that:
Translates data schemas between systems
Applies transformation actions (sync, static, date convert, etc.)
Supports conditional logic and complex mappings
Handles bidirectional field mapping
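The mapping system above can be pictured as a declarative list of actions applied to each record. The mapping schema and action names below are illustrative stand-ins, not Field Nation's actual configuration format:

```python
from datetime import datetime

# Hypothetical mapping configuration: each entry names an action and fields.
MAPPINGS = [
    {"action": "sync", "source": "subject", "target": "title"},
    {"action": "static", "value": "Default", "target": "type_of_work"},
    {"action": "date_convert", "source": "due", "target": "start_time",
     "in_fmt": "%m/%d/%Y", "out_fmt": "%Y-%m-%dT%H:%M:%SZ"},
]

def apply_mappings(record, mappings):
    """Translate an external record into the target schema."""
    out = {}
    for m in mappings:
        if m["action"] == "sync":
            out[m["target"]] = record[m["source"]]
        elif m["action"] == "static":
            out[m["target"]] = m["value"]
        elif m["action"] == "date_convert":
            dt = datetime.strptime(record[m["source"]], m["in_fmt"])
            out[m["target"]] = dt.strftime(m["out_fmt"])
    return out

print(apply_mappings({"subject": "Install AP", "due": "06/01/2025"}, MAPPINGS))
# → {'title': 'Install AP', 'type_of_work': 'Default',
#    'start_time': '2025-06-01T00:00:00Z'}
```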
JSONNET-based transformation engine for:
Complex data manipulation logic
Conditional field population
Custom business rules
Advanced array and object transformations
Learn more about JSONNET →
1. External system sends a webhook notification when a record is created or updated. Each connector has a unique trigger URL with an embedded authentication token.
2. A broker worker retrieves the message, validates the token, and fetches the complete record data from the external system using configured API credentials.
3. The system applies configured mappings to transform external data into the Field Nation work order schema, including static values, date conversions, and JSONNET transformations.
4. Transformed data is validated and submitted to the Field Nation API. The system stores a correlation between the external record ID and the Field Nation work order ID.
5. Failed operations are logged, retried with exponential backoff, and moved to the DLQ if max retries are exceeded. Email notifications are sent for persistent failures.
1. Field Nation publishes work order lifecycle events (status changes, assignments, completions, messages) to the integration event stream.
2. The broker identifies the originating external system using the correlation ID and applies reverse field mappings to transform Field Nation data to the external schema.
3. The system constructs an API request (REST, SOAP, or platform SDK) and submits the update to the external platform using configured authentication.
4. Successful updates are logged and the work order sync status is updated. Failed updates trigger a retry with exponential backoff.
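The correlation lookup in the outbound flow can be sketched as a simple table keyed by work order ID. The storage shape and field names here are illustrative, not the broker's actual data model:

```python
# Maps a Field Nation work order ID back to its originating system and record.
correlations = {}

def record_correlation(wo_id, system, external_id):
    correlations[wo_id] = (system, external_id)

def route_outbound(event):
    """Resolve which external record an outbound event should update."""
    system, external_id = correlations[event["work_order_id"]]
    return {"system": system, "record_id": external_id,
            "status": event["status"]}

record_correlation(101, "freshdesk", "TKT-9")
print(route_outbound({"work_order_id": 101, "status": "Completed"}))
# → {'system': 'freshdesk', 'record_id': 'TKT-9', 'status': 'Completed'}
```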
The broker uses a poll-and-acknowledge pattern:
Worker retrieves message from queue
Worker processes message completely
Worker acknowledges successful processing
If worker crashes, message remains unacknowledged and retries after timeout
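The crash-safety property of poll-and-acknowledge comes from a visibility timeout: a leased but unacknowledged message reappears on the queue. A minimal in-memory sketch (the timeout value and queue structure are assumptions for illustration):

```python
from collections import deque

queue = deque(["msg-1"])
in_flight = {}           # message -> time it was leased
VISIBILITY_TIMEOUT = 30  # seconds before an unacknowledged message retries

def poll(now):
    # Re-queue messages whose worker never acknowledged (e.g. it crashed).
    for msg, leased_at in list(in_flight.items()):
        if now - leased_at > VISIBILITY_TIMEOUT:
            del in_flight[msg]
            queue.append(msg)
    if not queue:
        return None
    msg = queue.popleft()
    in_flight[msg] = now
    return msg

def acknowledge(msg):
    in_flight.pop(msg, None)

msg = poll(now=0)       # worker leases the message
acknowledge(msg)        # processing succeeded; message is gone for good
assert poll(now=1) is None

queue.append("msg-2")
poll(now=0)             # leased but never acknowledged (simulated crash)
assert poll(now=60) == "msg-2"  # reappears after the visibility timeout
```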
Each integration respects external API quotas:
Maximum requests per minute
Concurrent connection limits
Burst allowances
Per-platform rate limit configuration
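Per-platform quotas with burst allowances are commonly enforced with a token bucket. The sketch below shows the general technique; the rates shown are examples, not Field Nation's actual limits:

```python
class TokenBucket:
    """Per-platform rate limiter: steady refill rate plus a burst allowance."""

    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec
        self.capacity = burst
        self.tokens = burst
        self.last = 0.0

    def allow(self, now):
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# e.g. 60 requests/minute with a burst of 5
bucket = TokenBucket(rate_per_sec=1.0, burst=5)
print([bucket.allow(now=0) for _ in range(6)])
# → [True, True, True, True, True, False]
```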
Failed operations retry with exponential backoff:
| Attempt | Delay | Action |
| --- | --- | --- |
| 1 | 0s | Immediate |
| 2 | 30s | First retry |
| 3 | 60s | Second retry |
| 4 | 120s | Third retry |
| 5 | 240s | Fourth retry |
| 6+ | 480s | Final attempts |
After maximum retries, operation moves to Dead Letter Queue for manual review.
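The schedule in the table doubles the delay each attempt and caps at 480 seconds, which can be expressed as a one-line function:

```python
def retry_delay(attempt):
    """Delay in seconds before the given attempt, per the schedule above."""
    if attempt <= 1:
        return 0  # first attempt is immediate
    return min(30 * 2 ** (attempt - 2), 480)

print([retry_delay(n) for n in range(1, 8)])
# → [0, 30, 60, 120, 240, 480, 480]
```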
Encryption at Rest: AES-256 encryption for all credentials
In-Memory Only: Decrypted only during active use
No Logging: Credentials never logged or exposed in errors
Token Refresh: Automatic OAuth token refresh
Basic Auth

```
Authorization: Basic base64(username:password)
```

Used by: Freshdesk, REST Connector

OAuth 2.0

```
Authorization: Bearer {access_token}
```

Automatic token refresh with refresh tokens. Used by: Salesforce, ServiceNow

API Keys

```
X-API-Key: {api_key}
Authorization: Bearer {token}
```

Platform-specific header formats. Used by: Quickbase, Smartsheet

SOAP/Token

```
<soap:Header>
  <TokenPassport>
    <account>{account_id}</account>
    <consumerKey>{consumer_key}</consumerKey>
    <token>{token}</token>
  </TokenPassport>
</soap:Header>
```

Used by: NetSuite, Autotask
Each trigger URL includes unique client token:
https://api.fieldnation.com/integrations/trigger/{client_token}
Broker validates token before processing any data.
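Token validation of this kind is typically done with a constant-time comparison to avoid timing side channels. A minimal sketch, assuming a hypothetical in-memory token registry (the real broker presumably resolves tokens from connector configuration):

```python
import hmac

# Hypothetical registry of client tokens -> connector names.
REGISTERED_TOKENS = {"tok_demo_123": "acme-connector"}

def resolve_connector(path_token):
    """Return the connector for a trigger-URL token, or None if invalid."""
    for token, connector in REGISTERED_TOKENS.items():
        # hmac.compare_digest avoids leaking match length via timing.
        if hmac.compare_digest(token, path_token):
            return connector
    return None

print(resolve_connector("tok_demo_123"))  # → acme-connector
print(resolve_connector("bogus"))         # → None
```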
Authentication Errors
Invalid/expired credentials
Insufficient permissions
Token refresh failures
Action: Manual intervention required
Validation Errors
Missing required fields
Invalid data types
Schema validation failures
Action: Fix field mappings or source data
API Errors
Rate limit exceeded
Network timeouts
Service unavailable
Action: Automatic retry with backoff
Transformation Errors
JSONNET execution failures
Field mapping exceptions
Data type conversions
Action: Review custom actions
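The four categories above map errors to responses. A sketch of that dispatch logic, with illustrative category and action names (not the broker's actual identifiers):

```python
# Transient API errors retry automatically; credential problems halt and
# notify; everything else needs a mapping/data review.
RETRYABLE = {"rate_limit", "timeout", "service_unavailable"}
MANUAL = {"auth_invalid", "auth_expired", "permission_denied"}

def next_action(error_kind):
    """Map an error category to the broker's response (illustrative names)."""
    if error_kind in RETRYABLE:
        return "retry_with_backoff"
    if error_kind in MANUAL:
        return "notify_and_halt"
    return "review_mapping"  # validation/transformation errors

print(next_action("timeout"))       # → retry_with_backoff
print(next_action("auth_expired"))  # → notify_and_halt
```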
Configure email alerts for:
Failed operations
Dead letter queue entries
Authentication failures
High error rates
Throughput: Thousands of work orders per minute
Scaling: Horizontal scaling via additional worker instances
Concurrency: Multiple workers process the queue in parallel
Typical: 2-5 seconds end-to-end (trigger → work order created)
Factors: External API response time, field mapping complexity, queue depth
Uptime: 99.9% availability
Failover: Automatic worker recovery
Circuit Breakers: Isolate failing external systems
Health Checks: Continuous monitoring
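A circuit breaker isolates a failing external system by refusing calls until a cooldown elapses, then letting one probe through. A minimal sketch of the technique; the threshold and cooldown values are illustrative:

```python
class CircuitBreaker:
    """Stops calling a failing external system until a cooldown elapses."""

    def __init__(self, failure_threshold=3, cooldown=60):
        self.failures = 0
        self.threshold = failure_threshold
        self.cooldown = cooldown
        self.opened_at = None  # None means the circuit is closed

    def allow(self, now):
        if self.opened_at is None:
            return True
        if now - self.opened_at >= self.cooldown:
            # Half-open: let one request probe whether the system recovered.
            self.opened_at = None
            self.failures = 0
            return True
        return False

    def record_failure(self, now):
        self.failures += 1
        if self.failures >= self.threshold:
            self.opened_at = now  # trip the breaker

breaker = CircuitBreaker()
for _ in range(3):
    breaker.record_failure(now=0)
print(breaker.allow(now=10))  # → False: circuit open, system isolated
print(breaker.allow(now=70))  # → True: cooldown elapsed, probe allowed
```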
All operations logged with:
Timestamps
User identifiers
Operation types
Outcomes (success/failure)
Error details
API request/response
Track key metrics:
Queue depths by type
Processing rates
Error frequencies by category
API response times
Sync success rates
Comprehensive audit records for:
Compliance requirements
Incident investigation
Performance analysis
Capacity planning
Integration Broker UI: app.fieldnation.com/integrations
Configure:
Authentication credentials (encrypted)
Field mappings (source → target)
Event triggers (which events sync)
Operation types (create only vs bidirectional)
Email notifications
Required Configuration:
Webhook endpoints pointing to Field Nation trigger URL
Automation rules (Flows, Business Rules, Workflows)
API user with appropriate permissions
Firewall/IP whitelist configuration (if applicable)
Platform-specific guides →
In-Transit Processing: Data processed in memory, not stored
Logging: Error logs scrubbed of sensitive information
Retention: Audit logs retained per compliance requirements
Encryption: All API calls use TLS 1.2+
The Integration Broker supports:
SOC 2 compliance
GDPR data handling
Audit trail requirements
Data residency considerations
✅ Test in sandbox before production
✅ Use service accounts (not personal credentials)
✅ Set up error notifications
✅ Monitor queue depths
✅ Document field mappings
✅ Rotate credentials periodically
✅ Use least-privilege API access
✅ Enable IP whitelisting where possible
✅ Review audit logs regularly
✅ Encrypt credentials in your documentation
✅ Keep field mappings simple when possible
✅ Use custom actions only when necessary
✅ Batch updates where supported
✅ Monitor API rate limits
✅ Scale workers for high volume
Complete troubleshooting guide →