Posts
Showing posts from August, 2025
**Scenario:** We need to import bulk data (tasks) from an external source (an Excel file from another system) into Dataverse.

🔹 Architecture Flow (Step by Step)

1. **Excel File Source**
   - The external system exports tasks into an Excel file.
   - The file includes: title, description, category, duration, status.
   - The file is uploaded to Azure Blob Storage.
2. **Staging Database (SQL Server)**
   - Before pushing directly into Dataverse, the file data is copied into a staging table in SQL Server.
   - Benefits: data validation before import; status tracking (an `import_status` column, e.g. Pending, Success, Failed); easier error handling and re-runs.
3. **Azure Data Factory (ADF) Pipeline**
   - Orchestrates the entire process.
   - Activity 1 (Copy Activity): reads the Excel file from Blob and loads it into the SQL staging table.
   - Activity 2 (Web Activity): calls a Logic App to start processing the staged rows.
4. **Logic App (Transformation + Import into Dataverse)**
   - Triggered via HTTP by ADF.
   - Iterates throug...
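The staging-table step above can be sketched as a small validation routine. This is a minimal illustration, assuming the columns named in the flow (title, description, category, duration, status); the required-field list and `error_reason` column are hypothetical, not the real schema.

```python
# Hypothetical staging validation: mark each Excel row Pending or Failed
# before the Logic App picks it up. Field names are illustrative.
REQUIRED_FIELDS = ("title", "category", "status")

def stage_row(row: dict) -> dict:
    """Validate one row and set its import_status for later processing."""
    missing = [f for f in REQUIRED_FIELDS if not row.get(f)]
    staged = dict(row)
    if missing:
        staged["import_status"] = "Failed"
        staged["error_reason"] = "Missing fields: " + ", ".join(missing)
    else:
        staged["import_status"] = "Pending"  # eligible for import
    return staged

rows = [
    {"title": "Task A", "category": "Ops", "status": "Open", "duration": 5},
    {"title": "", "category": "Ops", "status": "Open", "duration": 3},
]
staged = [stage_row(r) for r in rows]
```

Rows marked `Failed` stay in the staging table with a reason, which is what makes re-runs and error reporting straightforward.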
Key ADF Features for Row-Level Fault Tolerance
**Fault Tolerance in Copy Activity**
- You can configure *Skip incompatible rows* or redirect rows with errors to a separate log/sink (such as Blob or SQL).
- This lets you capture failed records without breaking the entire batch.

**Mapping Data Flows – Error Row Handling**
- Data Flows support "Error Row Handling", so you can redirect specific problematic rows (e.g., an invalid GUID or a missing lookup) to a quarantine table.
- Example: valid rows → Dataverse; invalid rows → an error table with a reason.

**Retry Policies**
- The Copy activity has built-in retry policies with exponential backoff for transient errors.
- Helpful when Dataverse throttles or API timeouts occur.

**Logging & Monitoring**
- ADF can now output error details (column name, error code, error description) to another sink, giving row-level granularity for diagnostics.

🔹 So Why Use Logic Apps Then?
Even though ADF now supports row-level fault tolerance, Logic Apps still makes sense in ...
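The error-row redirection pattern can be illustrated with a small split routine. This only mimics the shape of what Data Flow "Error Row Handling" produces (a main sink plus a quarantine sink with diagnostic columns); the `lookup_id` field is a stand-in for a real lookup column.

```python
# Illustrative row-level error redirection: valid rows go to the main sink,
# bad rows go to a quarantine list with error_column / error_description,
# mirroring the diagnostics ADF can emit. The GUID check stands in for a
# lookup validation; field names are hypothetical.
import uuid

def split_rows(rows):
    valid, quarantine = [], []
    for row in rows:
        try:
            uuid.UUID(str(row["lookup_id"]))  # invalid GUID raises ValueError
            valid.append(row)
        except (ValueError, KeyError) as exc:
            quarantine.append({
                **row,
                "error_column": "lookup_id",
                "error_description": str(exc),
            })
    return valid, quarantine

good, bad = split_rows([
    {"lookup_id": "11111111-1111-1111-1111-111111111111", "title": "ok"},
    {"lookup_id": "not-a-guid", "title": "broken"},
])
```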
Best practices to avoid throttling when Azure Functions integrate with Dataverse:
🔹 1. Use a Custom API / Plugin (preferred approach)
- Encapsulate multiple operations inside one Dataverse Custom API.
- The Azure Function then makes only one call per message.
- Benefits: fewer API calls, transactional consistency, centralized business logic.

🔹 2. Use Batch (ExecuteMultiple / $batch)
- If you must call Dataverse directly, use `$batch` requests.
- Example: instead of 50 separate `POST /contacts` calls, send them in one batch request.
- This reduces entitlement usage and network overhead.

🔹 3. Distribute Load Across Application Users
- Create multiple app registrations in Azure AD (with Dataverse application users).
- Spread API calls across them to avoid hitting the per-user limits.
- Example: Function A uses AppUser1, Function B uses AppUser2.

🔹 4. Implement Retry with Exponential Backoff
- Always catch 429 (Too Many Requests) and 5xx errors.
- Retry after the `Retry-After` header, or use exponential backoff (2s → 4s → 8s…).
- Don't retry immediately ...
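The retry pattern in point 4 can be sketched as a small wrapper. This is a generic sketch, not Dataverse SDK code: `send` stands in for the real HTTP call and is assumed to return a `(status, headers, body)` tuple, so the backoff logic can be shown on its own.

```python
# Sketch of retry with exponential backoff that prefers the Retry-After
# header on 429, as recommended above. `send` is a stand-in for the real
# Dataverse HTTP call (assumed to return status, headers, body).
import time

def call_with_backoff(send, max_retries=4, base_delay=2.0):
    for attempt in range(max_retries + 1):
        status, headers, body = send()
        if status < 400:
            return body
        retryable = status == 429 or status >= 500
        if not retryable or attempt == max_retries:
            raise RuntimeError(f"request failed with status {status}")
        # Honor Retry-After when present; otherwise 2s -> 4s -> 8s...
        delay = float(headers.get("Retry-After", base_delay * (2 ** attempt)))
        time.sleep(delay)

# Demo with a stub that is throttled twice, then succeeds
# (base_delay=0 keeps the demo instant).
responses = iter([
    (429, {"Retry-After": "0"}, None),
    (503, {}, None),
    (200, {}, {"ok": True}),
])
result = call_with_backoff(lambda: next(responses), base_delay=0.0)
```

Note that non-retryable 4xx errors (e.g. a 400 validation failure) fail fast instead of being retried, which is exactly the behavior you want when the payload itself is bad.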
Dataverse + Service Bus + Custom API integration:
Let me break that passage down in simple terms, with context around Dataverse + Service Bus + Custom API integration:

🔹 What It Means

"To execute operations on Dataverse, we rely on calling a custom API."
- Instead of making multiple direct CRUD calls (Create, Read, Update, Delete) from Azure Functions or Service Bus consumers into Dataverse, you create a Custom API in Dataverse.
- This Custom API acts as a single entry point for your external system.

"Our function will perform only one call to the Dataverse endpoints, reducing the consumption of API call entitlements."
- Every request to Dataverse consumes API call quota (licensing entitlements).
- If you send multiple calls for one business process (e.g., create a case, add tasks, update the customer, assign), you consume many entitlements.
- By wrapping all that logic in a Custom API, you reduce it to just one call.

"Our business logic can be grouped on the Dataverse side in a single plugin, exposing ...
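The entitlement saving can be made concrete with a call counter. This is a hedged sketch: `DataverseStub` is a fake, and the Custom API name `new_CreateCaseBundle` and the entity paths are illustrative, not real endpoints.

```python
# Hypothetical comparison: several direct CRUD calls vs. one Custom API call.
# DataverseStub just counts requests so the entitlement difference is visible.
class DataverseStub:
    def __init__(self):
        self.api_calls = 0

    def request(self, method, path, payload=None):
        self.api_calls += 1  # each request consumes one entitlement
        return {"ok": True}

def create_case_direct(dv, case, tasks):
    dv.request("POST", "/incidents", case)            # create case
    for task in tasks:
        dv.request("POST", "/tasks", task)            # one call per task
    dv.request("PATCH", "/contacts(00000000-0000-0000-0000-000000000001)", {})

def create_case_via_custom_api(dv, case, tasks):
    # One call; the plugin behind the Custom API does the rest transactionally.
    dv.request("POST", "/new_CreateCaseBundle", {"case": case, "tasks": tasks})

direct, wrapped = DataverseStub(), DataverseStub()
tasks = [{"subject": f"Task {i}"} for i in range(3)]
create_case_direct(direct, {"title": "Case"}, tasks)          # 5 calls
create_case_via_custom_api(wrapped, {"title": "Case"}, tasks)  # 1 call
```

The same business process costs five entitlements in the direct version and one in the wrapped version, and the Custom API version also gets plugin-level transactional consistency for free.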
Overview of Dynamics 365 Contact Center
Dynamics 365 Contact Center was launched in July 2022, marking a significant evolution within Microsoft's product suite, following a two-decade journey of development within the Dynamics family. The launch was influenced by the acquisition of Nuance, which brought advanced AI capabilities, including IVR, biometrics, and fraud prevention features. The solution is designed to integrate seamlessly with existing Microsoft services and is built on Azure cloud architecture.

**Key Features**
- **Omnichannel Support**: The platform provides a comprehensive omnichannel experience, allowing interactions across channels such as voice, chat, and social media.
- **AI Integration**: Uses AI-driven features such as Copilot for agent assistance and frictionless conversational IVR for enhanced customer engagement.
- **Data and Reporting**: Leverages Microsoft Dataverse for real-time data access and Power BI for reporting and analytics, improving decision-making for contact center managers....
Overview of Dynamics 365 Contact Center and Customer Service
Dynamics 365 Contact Center and Dynamics 365 Customer Service are distinct products designed for customer interaction, each suited to different use cases. Both applications provide omnichannel communication capabilities, allowing interactions across voice, chat, and other channels. The choice between the two hinges primarily on the case management system in use.

**Key Functional Differences**
The Customer Service application integrates seamlessly with Dynamics 365 case management, enabling users to create cases directly from the customer interaction interface. In contrast, Contact Center lacks built-in case creation and requires a connection to a third-party case management system, such as Salesforce. This difference in case management capabilities is crucial when deciding which product to implement based on organizational needs.

**User Interface and Experience**
Both applications feature fami...
Handling Transaction Failures in Dataverse Integrations
When Dataverse is integrated with external systems, ensuring consistency between the two systems during failures is critical. Since Dataverse does not support distributed transactions with third-party systems, you need a compensating mechanism and a notification strategy.

**1. Failure Notification to the External System**
- Use Power Automate or a plugin to catch transaction-failure events.
- Notify the third-party application via:
  - Service Bus / Event Hubs (for async notification)
  - Webhook / Custom API call (for direct notification)
- Ensure the payload includes the transaction details, error reason, and a correlation ID.

**2. Rollback Mechanism (External System Considerations)**
Since Dataverse cannot directly roll back changes in an external system, you must design for compensating transactions:
- Check whether the external system provides rollback APIs (e.g., delete, cancel, or reverse-transaction endpoints).
- If rollback is supported: call the rollback API when the Dataverse transact...
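The compensating flow above can be sketched end to end. This is an illustrative shape only: the external system is a stub, and the `supports_rollback` / `rollback` / `notify` interface is an assumption about what the third-party API might expose.

```python
# Sketch of a compensating transaction plus failure notification.
# ExternalSystemStub stands in for the third-party system; its interface
# (supports_rollback, rollback, notify) is hypothetical.
import uuid

class ExternalSystemStub:
    supports_rollback = True

    def __init__(self):
        self.rolled_back = []
        self.notifications = []

    def rollback(self, txn_id, correlation_id):
        self.rolled_back.append(txn_id)       # e.g. cancel/reverse endpoint

    def notify(self, payload):
        self.notifications.append(payload)    # e.g. Service Bus / webhook

def handle_dataverse_failure(external, txn_id, error_reason):
    """On a Dataverse transaction failure: compensate, then notify."""
    correlation_id = str(uuid.uuid4())
    if getattr(external, "supports_rollback", False):
        external.rollback(txn_id, correlation_id)  # compensating transaction
    external.notify({
        "transaction_id": txn_id,      # transaction details
        "error": error_reason,         # error reason
        "correlation_id": correlation_id,
    })
    return correlation_id

ext = ExternalSystemStub()
cid = handle_dataverse_failure(ext, "TXN-001", "Plugin validation failed")
```

The correlation ID ties the rollback and the notification to the original failed transaction, which is what makes cross-system troubleshooting possible later.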
Key Steps for Handling Large Data Volumes During Migration
Migrating large volumes of data to a new system is always a challenge. It requires careful planning to ensure a smooth cutover, minimal downtime, and data integrity. Below are key steps and considerations for handling high data volumes during migration:

**1. Categorize Your Data**
Not all data is the same. Classify it into categories such as:
- Transactional Data – recent or ongoing business operations.
- Master Data – customers, vendors, products, or reference data.
- Historical Data – legacy records that may be required for compliance or reference.

This helps you apply a different migration strategy to each category.

**2. Explore Early Migration Options**
Where possible, migrate historical or less frequently used data earlier, for example during UAT (User Acceptance Testing). This reduces the data volume that must be handled during the final cutover window, minimizing downtime.

**3. Identify and Handle Delta Data**
During the migration window, new transactions m...
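One common way to identify delta data is a watermark on a last-modified timestamp: anything changed after the bulk load finished belongs to the delta. A minimal sketch, assuming a `modified_on` field (the field name and dates are illustrative):

```python
# Sketch of delta detection via a timestamp watermark. Records modified
# after the bulk load completed must be migrated again during cutover.
from datetime import datetime

def find_delta(records, migrated_at):
    """Return records created or changed after the bulk migration."""
    return [r for r in records if r["modified_on"] > migrated_at]

bulk_load_done = datetime(2025, 8, 1)   # hypothetical bulk-load completion
records = [
    {"id": 1, "modified_on": datetime(2025, 7, 20)},  # already migrated
    {"id": 2, "modified_on": datetime(2025, 8, 3)},   # changed during window
]
delta = find_delta(records, bulk_load_done)
```

The same idea works against a SQL staging table with a `WHERE modified_on > @watermark` filter; the key design point is that the watermark is recorded once, when the bulk load finishes.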
Key Insights from the Unified Routing Partner Workshop – Routing Emails & Cases in Dynamics 365
In the world of modern customer service, getting the right case to the right agent at the right time is critical. Microsoft's Unified Routing in Dynamics 365 Customer Service takes this challenge head-on by blending automation, AI-driven intelligence, and flexible rules to deliver optimal routing for cases and emails. Here's a breakdown of the key takeaways from the recent workshop on how Unified Routing works and why it matters.

**What is Unified Routing?**
Unified Routing automates how work items (such as cases, chats, or emails) are assigned to agents. Using a combination of machine learning, configurable rules, and real-time agent data, the system factors in:
- Agent skills
- Availability
- Customer urgency and sentiment

The result? Smarter, more efficient distribution of work that aligns with business goals and customer expectations.

**Configuration & Workforce Management**
Unified Routing is controlled from a centralized management hub, giving admins th...
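To show the shape of the factors listed above, here is a toy rule-based scoring sketch. Unified Routing's real assignment uses ML models and configurable rulesets; this is only an illustration of combining skills, availability, and urgency, with made-up agent fields.

```python
# Toy routing score combining the three factors from the workshop notes:
# skills, availability, and urgency. All field names are hypothetical.
def score_agent(agent, work_item):
    if not agent["available"]:
        return -1  # unavailable agents are never assigned
    skill_match = len(set(agent["skills"]) & set(work_item["required_skills"]))
    # Urgent (or negative-sentiment) items favor higher-proficiency agents.
    urgency_bonus = agent["proficiency"] if work_item["urgent"] else 0
    return skill_match * 10 + urgency_bonus

def route(work_item, agents):
    best = max(agents, key=lambda a: score_agent(a, work_item))
    return best["name"] if score_agent(best, work_item) >= 0 else None

agents = [
    {"name": "Ana", "skills": {"billing"}, "available": True, "proficiency": 3},
    {"name": "Ben", "skills": {"billing", "fraud"}, "available": True, "proficiency": 5},
    {"name": "Cy", "skills": {"fraud"}, "available": False, "proficiency": 9},
]
item = {"required_skills": {"fraud"}, "urgent": True}
assigned = route(item, agents)
```

Even in this toy version, the key property holds: the best-skilled agent (Cy) is skipped because they are unavailable, and the work item goes to the best available match instead.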