Best practices to avoid throttling when Azure Functions integrate with Dataverse:
🔹 1. Use Custom API / Plugin (preferred approach)
- Encapsulate multiple operations inside one Dataverse Custom API.
- The Azure Function only makes one call per message.
- Benefits: fewer API calls, transactional consistency, centralized business logic.
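The one-call-per-message pattern can be sketched as below. This is a minimal illustration, not a definitive implementation: `new_ProcessOrder` is a hypothetical Custom API unique name, and the `Operations` parameter shape is assumed; the real names come from your own Custom API definition.

```python
import json

def build_custom_api_request(org_url, api_name, operations):
    """Wrap several logical operations into a single Custom API call.

    The whole unit of work travels as one JSON payload, so the Azure
    Function spends exactly one Dataverse API call per queue message.
    """
    return {
        "url": f"{org_url}/api/data/v9.2/{api_name}",
        "method": "POST",
        "body": json.dumps({"Operations": operations}),
    }

# One message -> one call, even though three operations are performed.
request = build_custom_api_request(
    "https://contoso.crm.dynamics.com",  # placeholder org URL
    "new_ProcessOrder",                  # hypothetical Custom API name
    [{"step": "create_contact"}, {"step": "create_order"}, {"step": "send_email"}],
)
```

The plugin registered on the Custom API then performs all of the operations server-side, inside one transaction.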
🔹 2. Use Batch (ExecuteMultiple / $batch)
- If you must call Dataverse directly, use $batch requests.
- Example: instead of 50 separate POST /contacts calls, send them in one batch request.
- This reduces entitlement usage and network overhead.
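As a sketch, the 50-contact example can be packed into one OData $batch body like this (the org URL is a placeholder; in production you would POST this body to `/api/data/v9.2/$batch` with a matching `Content-Type: multipart/mixed; boundary=...` header):

```python
import json
import uuid

def build_batch_payload(org_url, contacts):
    """Pack many contact creations into one $batch request body.

    Operations inside a change set execute as a single transaction,
    and one HTTP round trip replaces 50 individual POSTs.
    """
    batch_id = f"batch_{uuid.uuid4()}"
    changeset_id = f"changeset_{uuid.uuid4()}"
    lines = [
        f"--{batch_id}",
        f"Content-Type: multipart/mixed; boundary={changeset_id}",
        "",
    ]
    for i, contact in enumerate(contacts, start=1):
        lines += [
            f"--{changeset_id}",
            "Content-Type: application/http",
            "Content-Transfer-Encoding: binary",
            f"Content-ID: {i}",
            "",
            f"POST {org_url}/api/data/v9.2/contacts HTTP/1.1",
            "Content-Type: application/json",
            "",
            json.dumps(contact),
        ]
    lines += [f"--{changeset_id}--", f"--{batch_id}--"]
    return batch_id, "\r\n".join(lines)
```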
🔹 3. Distribute load across Application Users
- Create multiple app registrations in Azure AD (each with its own Dataverse application user).
- Spread API calls across them to avoid hitting the per-user limits.
- Example: Function A uses AppUser1, Function B uses AppUser2.
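A minimal sketch of rotating a pool of application users, assuming each name maps to its own app registration and token acquisition elsewhere in the Function:

```python
from itertools import cycle

# Placeholder pool; each entry would map to its own app registration
# (client ID + secret/certificate) and therefore its own per-user limit.
APP_USERS = ["AppUser1", "AppUser2", "AppUser3"]
_pool = cycle(APP_USERS)

def next_app_user():
    """Round-robin over application users so no single identity
    absorbs all of the service-protection limit pressure."""
    return next(_pool)
```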
🔹 4. Implement Retry with Exponential Backoff
- Always catch 429 (Too Many Requests) and 5xx errors.
- Retry after the Retry-After header, or use exponential backoff (2s → 4s → 8s…).
- Don’t retry immediately; that makes throttling worse.
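The retry rule above can be sketched like this; `send` is any callable returning `(status, headers)`, and `sleep` is injectable for testing:

```python
import time

def call_with_backoff(send, max_retries=5, base_delay=2.0, sleep=time.sleep):
    """Retry throttled (429) and transient 5xx responses.

    Honors the server's Retry-After header when present; otherwise
    backs off exponentially: 2s, 4s, 8s, ...
    """
    for attempt in range(max_retries + 1):
        status, headers = send()
        if status == 429 or status >= 500:
            if attempt == max_retries:
                raise RuntimeError("Retries exhausted; still throttled")
            delay = float(headers.get("Retry-After", base_delay * 2 ** attempt))
            sleep(delay)  # never retry immediately
            continue
        return status
```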
🔹 5. Control Concurrency in the Azure Function
- By default, Functions can scale out aggressively and hammer Dataverse.
- Use host.json settings (for example, maxConcurrentCalls on the Service Bus trigger) to limit parallelism.
- Setting maxConcurrentCalls to 10, for instance, ensures only 10 messages are processed at a time per instance.
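A minimal host.json sketch for a Service Bus-triggered Function (the exact property path depends on the Service Bus extension version in use):

```json
{
  "version": "2.0",
  "extensions": {
    "serviceBus": {
      "messageHandlerOptions": {
        "maxConcurrentCalls": 10
      }
    }
  }
}
```

Note that this limit is per instance; if the app scales out, total concurrency multiplies accordingly, so consider also capping scale-out (for example, with the functionAppScaleLimit site setting).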
🔹 6. Prefer Service Bus Sessions / Queues for pacing
- Queue messages instead of firing them all at once.
- This lets you control throughput and align with Dataverse API limits.
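Beyond the trigger settings, a simple pacer can smooth dequeue throughput. This is an illustrative sketch only (the clock and sleep parameters are injectable for testing); in practice, Service Bus prefetch and maxConcurrentCalls usually provide enough control:

```python
import time

class Pacer:
    """Space requests evenly to stay under a requests-per-second budget."""

    def __init__(self, per_second, clock=time.monotonic, sleep=time.sleep):
        self.interval = 1.0 / per_second
        self.clock = clock
        self.sleep = sleep
        self.next_slot = clock()

    def wait(self):
        """Block until the next send slot, then reserve the one after it."""
        now = self.clock()
        if now < self.next_slot:
            self.sleep(self.next_slot - now)
            now = self.next_slot
        self.next_slot = now + self.interval
```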
🔹 7. Use Change Tracking / Delta Loads
- For data sync, move only delta records (changed rows).
- This reduces unnecessary Dataverse API calls.
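Dataverse's built-in change tracking (the Prefer: odata.track-changes header and the delta link returned with the response) is the proper mechanism here; the watermark idea behind a delta load can be sketched as:

```python
from datetime import datetime, timezone

def delta_rows(rows, last_sync):
    """Return only rows modified after the last sync watermark,
    so unchanged records never cost a Dataverse API call."""
    return [row for row in rows if row["modifiedon"] > last_sync]
```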
🔹 8. Monitor & Tune
- Enable Dataverse API metrics in the Power Platform Admin Center.
- Track API calls/minute, throttled calls, and failed requests.
- Tune Function concurrency based on observed limits.
✅ Summary:
- Best option → use a Custom API + plugin to minimize calls.
- If calling directly → use batching, retries with backoff, concurrency limits, and multiple application users.
- Always monitor entitlements and adjust load.