The ultimate guide to Azure Cloud security audits: Importance, best practices and checklist
https://www.confiz.com/blog/the-ultimate-guide-to-azure-cloud-security-audits-importance-best-practices-and-checklist/ (Fri, 04 Jul 2025)

As businesses rapidly migrate their workloads to the cloud, Microsoft Azure has emerged as a leading cloud platform powering digital transformation for enterprises worldwide. However, with the speed and scalability of Azure cloud services come new cybersecurity challenges.

While Microsoft Azure provides robust infrastructure-level protections, the shared responsibility model means that organizations remain accountable for securing their applications, data, identities, and access configurations within their Azure environment. This is where an Azure cloud security audit becomes essential, a critical process designed to identify vulnerabilities, ensure regulatory compliance, and maintain a strong security posture amid an evolving threat landscape.

Importance of a security audit: Why does it matter more than ever?

In today’s dynamic and complex cloud environments, cyber threats are becoming more advanced and persistent. Many organizations assume that Microsoft secures everything within Azure by default. This misconception leads to overlooked risks, including misconfigured resources, excessive permissions, outdated policies, and unencrypted sensitive data.

A comprehensive Azure security audit helps identify vulnerabilities across your cloud infrastructure. Without regular audits, these hidden risks can go unnoticed, increasing the likelihood of security breaches and compliance violations. Here’s why an Azure security audit is more important than ever:

Prevent costly data breaches

One of the most serious consequences of poor cloud security is a data breach. Weak access controls or improperly configured Azure storage accounts can expose sensitive customer, employee, or financial data. A breach can result in financial loss, legal complications, and erosion of customer trust.

Meet regulatory compliance requirements

Azure environments must comply with industry regulations such as GDPR, HIPAA, ISO 27001, and NIST. A lack of regular security assessments can result in non-compliance, leading to hefty fines and reputational damage. Security audits help validate your alignment with these standards and uncover gaps before regulators do.

Strengthen identity and access management

Unauthorized access resulting from compromised credentials or inadequate identity controls poses a significant security risk. Azure security audits assess your identity and access management setup, helping to enforce best practices such as multi-factor authentication (MFA) and role-based access control (RBAC) to prevent privilege escalation and unauthorized activity.

Detect and prevent hidden cost exploits

Unsecured Azure environments can be exploited for illicit activities such as crypto mining, resulting in unexpected spikes in cloud usage costs. Regular audits help detect such activities early and reduce unnecessary expenditures.

Reduce operational downtime

Security incidents, whether caused by DDoS attacks, malware, or human error, can result in significant downtime. This affects business continuity and violates SLAs. Azure security audits help you strengthen incident response plans and improve overall system resilience.

Further readings: Best practices for setting up a cloud security compliance framework

What does a security audit cover? Your Azure Security Audit checklist

A comprehensive Azure security audit is a crucial process for assessing the security controls and configurations within your cloud environment. By regularly auditing your Azure setup, you can identify vulnerabilities, prevent security incidents, and ensure compliance with regulatory standards. Below are the key areas typically covered in an Azure security audit checklist:

1: Identity and Access Management (IAM)

Identity and Access Management (IAM) ensures that only authorized users and applications have access to your Azure resources. A security audit of IAM focuses on reviewing user permissions, enforcing multi-factor authentication (MFA), and securing access controls to minimize the risk of unauthorized access.

  • Role-Based Access Control (RBAC): Review access permissions to ensure users and apps have only the necessary privileges.
  • Multi-Factor Authentication (MFA): Confirm that MFA is enforced for all accounts, especially administrative ones.
  • Identity protection: Audit your Microsoft Entra ID (previously known as Azure Active Directory) settings for security and compliance with identity policies.
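
As a lightweight companion to this checklist, the least-privilege review can be sketched in a few lines. This is an illustrative sketch only; the assignment dictionaries and their field names (`principal`, `role`, `scope`) are simplified assumptions, not the shape of the actual Azure Authorization API response.

```python
# Illustrative RBAC review: flag broad role assignments granted at wide
# scopes, which are prime candidates for least-privilege remediation.

BROAD_ROLES = {"Owner", "Contributor", "User Access Administrator"}

def flag_broad_assignments(assignments):
    """Return assignments granting a broad role at subscription scope."""
    flagged = []
    for a in assignments:
        # A scope like "/subscriptions/<id>" has only two path segments;
        # anything deeper (resource group, resource) is narrower.
        wide_scope = a["scope"].count("/") <= 2
        if a["role"] in BROAD_ROLES and wide_scope:
            flagged.append(a)
    return flagged

assignments = [
    {"principal": "alice@contoso.com", "role": "Owner",
     "scope": "/subscriptions/1234"},
    {"principal": "build-sp", "role": "Reader",
     "scope": "/subscriptions/1234/resourceGroups/rg-app"},
]
print([a["principal"] for a in flag_broad_assignments(assignments)])
```

In a real audit, the input would come from your subscription's role assignment export rather than hard-coded data.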

2: Network security

Network security safeguards your resources from unauthorized network access and attacks. This audit area focuses on reviewing network configurations, including virtual networks, firewalls, and VPNs, to ensure proper segmentation, isolation, and encryption of network traffic.

  • Virtual networks and subnets: Evaluate the configuration of virtual networks to ensure proper segmentation and security.
  • Network Security Groups (NSGs): Check for correct inbound and outbound traffic rules to restrict unauthorized access.
  • Firewalls and VPNs: Ensure that Azure Firewall and VPN connections are set up securely to prevent unauthorized network traffic.
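
The NSG portion of this review follows the same pattern. The sketch below is illustrative; the rule fields (`direction`, `access`, `source`, `port`) loosely mirror, in simplified form, the properties of an NSG security rule and are assumptions rather than the exact schema.

```python
# Minimal NSG rule check: flag inbound "Allow" rules open to the internet
# on common management ports, a frequent misconfiguration.

RISKY_PORTS = {"22", "3389"}   # SSH, RDP

def risky_inbound_rules(rules):
    """Return inbound allow rules exposing management ports publicly."""
    return [
        r for r in rules
        if r["direction"] == "Inbound"
        and r["access"] == "Allow"
        and r["source"] in ("*", "0.0.0.0/0", "Internet")
        and r["port"] in RISKY_PORTS
    ]

rules = [
    {"name": "allow-rdp-any", "direction": "Inbound", "access": "Allow",
     "source": "*", "port": "3389"},
    {"name": "allow-https-internal", "direction": "Inbound",
     "access": "Allow", "source": "10.0.0.0/8", "port": "443"},
]
for r in risky_inbound_rules(rules):
    print(f"Review rule: {r['name']}")
```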

3: Data protection

Data protection involves securing sensitive information from unauthorized access or accidental exposure. This audit area focuses on assessing encryption practices, secure data storage, and ensuring that sensitive data is protected both in transit and at rest.

  • Encryption protocols: Ensure data is encrypted both at rest and during transmission.
  • Azure Key Vault: Verify that sensitive information, such as keys and certificates, is securely stored in Azure Key Vault.
  • Data classification: Review data classification policies to protect sensitive information from unauthorized access.

4: Compliance alignment

Azure environments must comply with various industry regulations like GDPR, HIPAA, and NIST. A security audit assesses how well your Azure setup aligns with these standards, helping you identify gaps and minimize the risk of non-compliance penalties.

  • Regulatory standards: Map your Azure configurations against regulatory requirements such as GDPR and HIPAA to identify compliance gaps.
  • Compliance manager reports: Review compliance reports to ensure your resources align with legal and regulatory requirements.

5: Monitoring and logging

Continuous monitoring and logging are crucial for identifying potential security threats in real-time. This audit area ensures that proper monitoring tools, like Azure Security Center and Azure Sentinel, are in place to track security events, generate alerts, and maintain logs for future incident investigations.

  • Azure Security Center: Ensure that continuous security monitoring is enabled to identify vulnerabilities in real-time.
  • Azure Sentinel: Leverage Azure Sentinel for advanced threat detection and incident management.
  • Audit logs: Confirm that audit logs are enabled and retained to support security investigations.
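
Log retention is one of the easiest items here to verify programmatically. The sketch below checks retention against a 90-day minimum; both the threshold and the category names are assumptions for illustration (actual requirements depend on your regulatory obligations).

```python
# Hypothetical retention check: confirm each log category is retained for
# at least a baseline number of days before it rolls off.

MIN_RETENTION_DAYS = 90

def retention_gaps(log_settings):
    """Return names of log categories retained for less than the minimum."""
    return [name for name, days in log_settings.items()
            if days < MIN_RETENTION_DAYS]

settings = {"SignInLogs": 30, "AuditLogs": 365, "ActivityLogs": 90}
print(retention_gaps(settings))   # → ['SignInLogs']
```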

6: Configuration review

Ensuring your Azure resources are configured according to best practices and security policies is essential for reducing the attack surface. A configuration review audits the settings across resources, such as virtual machines and app services, to confirm compliance with security standards.

  • Azure Resource Manager (ARM) templates: Review ARM templates for security best practices when provisioning resources.
  • App services security: Audit the security settings for app services to ensure secure coding practices are in place.
  • Virtual Machines (VMs): Ensure proper configuration and patch management for VMs to protect against vulnerabilities.

How do you strengthen your organization’s Azure Security posture?

While Microsoft provides powerful tools, such as Microsoft Defender for Cloud and Azure Policy, to help secure your Azure environment, a robust security posture requires a strategic and layered approach. Some Azure security audit best practices that you can adopt to strengthen your organization’s security posture include:

  • Enabling Multi-Factor Authentication (MFA) adds an extra layer of identity verification, reducing the risk of compromised user accounts.
  • Implementing Role-Based Access Control (RBAC) enforces the principle of least privilege, ensuring users have only the access necessary to perform their roles.
  • Encrypting data both at rest and in transit protects sensitive information from interception or unauthorized access.
  • Setting up virtual networks and firewall controls helps manage network traffic and protect cloud assets from external threats.
  • Performing regular vulnerability scans and patch management helps identify and remediate security gaps before they can be exploited, thereby enhancing overall security.
  • Utilizing continuous monitoring with Azure Defender and Azure Sentinel detects emerging threats, analyzes security incidents, and responds swiftly to security alerts.
  • Engaging in third-party security audits helps you gain an unbiased assessment of your Azure security configurations and receive actionable recommendations.

Protect your cloud investments with Confiz’s Azure security expertise

At Confiz, we help enterprises gain complete visibility into their Azure cloud environment, minimize security risks, and close compliance gaps through our in-depth Azure Cloud Security Assessment. Our expert team conducts a thorough evaluation of your Azure configurations, identifies vulnerabilities, and provides clear, actionable steps to strengthen your security posture.

Our Azure Security Assessment offer includes:

  • Comprehensive audit report detailing vulnerabilities, misconfigurations, and compliance gaps
  • Risk heatmap identifying high-priority security concerns
  • An actionable remediation plan aligned with industry best practices
  • Recommendations for strengthening compliance with GDPR, HIPAA, and NIST
  • Data encryption assessment (at rest and in transit)
  • Network security review, including Virtual Networks and Firewalls
  • Vulnerability scanning and penetration testing
  • Continuous monitoring setup with Microsoft Defender for Cloud
  • Audit readiness evaluation and compliance tracking

By partnering with Confiz, you can reduce the risk of data breaches, improve regulatory alignment, safeguard your brand reputation, and ensure business continuity.

Regular Azure security audits are not just a best practice; they’re essential. They empower your organization to identify threats early, enforce compliance, and build a resilient cloud environment. To learn more or request your assessment, contact us at marketing@confiz.com.

Retrieving data in Dynamics 365 F&O in Copilot Studio using Power Automate
https://www.confiz.com/blog/retrieving-data-in-dynamics-365-fo-in-copilot-studio-using-power-automate/ (Thu, 03 Jul 2025)

In today’s AI-driven business landscape, organizations are looking to automate critical financial operations and streamline decision-making. Microsoft Dynamics 365 Finance and Operations (D365 F&O) offers robust tools for managing complex business processes; however, accessing this data can often be slow and manual.

By integrating D365 F&O with Microsoft Copilot Studio, businesses can enable conversational agents that retrieve and interact with F&O data using natural language.

In this article, we will explore two integration approaches, via Power Automate flows and Dataverse tables, and how they support the automation of key financial and operational processes.

Working with Dynamics 365 F&O data entities

At the core of Dynamics 365 Finance and Operations are data entities that store critical business data, such as customers, vendors, and financial transactions. These entities are exposed through OData endpoints, providing standardized web access to data that external tools, such as Microsoft Copilot Studio, can consume.

Example:

  • Entity Name: CustomersV3
  • Endpoint: https://[your-instance].operations.dynamics.com/data/

Customization options for data entities

While default entities cover many scenarios, organizations often need to tailor data structures to their unique business processes. Dynamics 365 Finance and Operations supports:

  • Extending existing entities by adding custom fields to capture additional information relevant to financial operations or human resources.
  • Developing custom entities using X++ development enables deeper customization for specialized business-critical agents.

These customization capabilities ensure that your AI agents can access and operate on the precise data needed to execute complex workflows.

Read more: The integration of Dynamics 365 F&O and Copilot Studio: Low-code for AI

Methods to retrieve and integrate Dynamics 365 F&O Data into Copilot Studio

There are two main approaches to bring Dynamics 365 Finance and Operations data into Microsoft Copilot Studio for agent development:

1. Power Automate flows

This method is ideal for creating flexible, logic-based workflows that can dynamically retrieve, transform, and process data. Power Automate features extensive connectors and a low-code environment, making it accessible to both developers and business users for automating business processes.

2. Dataverse tables

For organizations replicating D365 F&O data into Microsoft Dataverse, Copilot Studio can directly access this data, simplifying integration and enabling faster agent flows.

In Copilot Studio, agent triggering allows agents to be activated by user messages or specific events, enabling them to automate business processes and execute actions across various channels and applications.

Both methods support the creation of powerful AI agents that can execute financial processes, automate business workflows, and interact across multiple channels such as Microsoft Teams or voice agents.

Using Power Automate for retrieving and processing data in Dynamics 365 F&O: An implementation guide

For Dynamics 365 Finance and Operations, the “Fin & Ops Apps (Dynamics 365)” connector provides direct access to entities and operations, enabling you to create intelligent agents that automate financial operations and other mission-critical business processes.

Choosing the right flow action

Power Automate offers several actions tailored for D365 F&O data management:

  Action             Description
  Get a Record       Retrieve one item from an entity
  List Records       Fetch multiple rows from an entity
  Create a Record    Add new data to an entity
  Update a Record    Modify existing entries
  Delete a Record    Remove data from the system
  Execute Action     Trigger predefined operations

For retrieving bulk data with specific criteria, the List Records action is particularly powerful, allowing you to filter, sort, and limit results efficiently.

Key parameters in the ‘List Records’ action

When configuring the List Records action, consider these parameters to optimize data retrieval:

  • Instance name: Specifies the D365 F&O environment hosting your data.
  • Entity name: The target entity, such as CustomersV3 or custom entities.
  • Filter query: Retrieves only records that match the specified criteria.
  • Order by: Sorts results by specified fields.
  • Top count: Limits the number of records returned.
  • Select query: Selects specific fields to reduce payload.
  • Cross company: Enables fetching data across multiple legal entities.
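
Under the hood, these parameters correspond to OData query options on the entity endpoint. The sketch below shows that mapping; the instance URL is a placeholder, and the helper itself is illustrative (Power Automate assembles the query for you).

```python
from urllib.parse import urlencode

# Illustrative mapping of List Records parameters onto OData query options.

def build_odata_url(instance, entity, filter_q=None, order_by=None,
                    top=None, select=None, cross_company=False):
    """Assemble an OData query URL from List Records-style parameters."""
    params = {}
    if filter_q:
        params["$filter"] = filter_q          # Filter query
    if order_by:
        params["$orderby"] = order_by         # Order by
    if top:
        params["$top"] = top                  # Top count
    if select:
        params["$select"] = ",".join(select)  # Select query
    if cross_company:
        params["cross-company"] = "true"      # Cross company
    return f"{instance}/data/{entity}?{urlencode(params, safe='$')}"

url = build_odata_url(
    "https://contoso.operations.dynamics.com", "CustomersV3",
    filter_q="CustomerGroupId eq '10'", top=5,
    select=["CustomerAccount", "OrganizationName"])
print(url)
```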

Data transformation in Power Automate

After pulling data from D365 F&O, you can shape and format it for Microsoft Copilot Studio using built-in actions:

  • Parse JSON: Converts raw responses into structured data for easier handling.
  • Select: Maps and extracts the necessary fields to create a clean output.
  • Append to string: Formats responses into user-friendly messages.
  • Send to Microsoft Copilot Studio: Passes processed data back to the conversational AI for agent responses.

These transformations empower your AI agents to understand and respond with precise and actionable business information.
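
For readers more comfortable with code than flow designers, here is a rough Python analogue of the Parse JSON, Select, and Append-to-string chain. The sample payload shape (a top-level "value" array, as OData responses return) is illustrative.

```python
import json

# Simulated raw OData response, as a JSON string.
raw = json.dumps({"value": [
    {"CustomerAccount": "US-001", "OrganizationName": "Contoso Retail"},
    {"CustomerAccount": "US-002", "OrganizationName": "Fabrikam"},
]})

records = json.loads(raw)["value"]                     # Parse JSON
selected = [{"account": r["CustomerAccount"],          # Select
             "name": r["OrganizationName"]} for r in records]
message = "; ".join(f"{r['account']}: {r['name']}"     # Append to string
                    for r in selected)
print(message)   # → US-001: Contoso Retail; US-002: Fabrikam
```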

Connecting the flow with Microsoft Copilot Studio

Once your Power Automate flow is ready, integrate it within a Topic in Microsoft Copilot Studio. This connection enables your AI agents to trigger the flow, retrieve data, and interact in real time with users.

For testing, use tools like the Test Sidecar or the D365 F&O Sidecar environment to simulate conversations and validate agent behavior before publishing.

After thorough testing, publish your changes to make intelligent agents available to end users, enabling them to execute business processes conversationally across multiple channels, including Microsoft Teams and voice agents.

Conclusion

Successfully integrating Microsoft Copilot Studio with Dynamics 365 Finance and Operations empowers organizations to automate business processes with intelligent, AI-driven agents. By connecting through Power Automate flows and Dataverse tables, businesses can eliminate manual effort, enhance operational visibility, and accelerate decision-making across finance, HR, and supply chain functions.

Exploring how Copilot Studio can transform your ERP automation strategy? We’ll help you assess integration needs, design scalable solutions, and ensure a smooth deployment. For tailored guidance, connect with us at marketing@confiz.com.

Smart licensing in Dynamics 365 F&O: How to cut costs and prepare for upcoming changes?
https://www.confiz.com/blog/smart-licensing-in-dynamics-365-fo-how-to-cut-costs-and-prepare-for-upcoming-changes/ (Wed, 02 Jul 2025)

Starting September 1, 2025, Microsoft will begin enforcing user license validations in Microsoft Dynamics 365 Finance & Operations (D365FO). Active users without appropriate licenses aligned to their system access will start receiving notifications within the D365FO Production environment.

The user license validation notice is part of Microsoft’s Proactive Quality Updates (PQU) and has been rolled out to customer environments as a mandatory feature since May 2025.

Read further: What does the Microsoft D365 F&SCM license enforcement 2025 mean?

So, how do you prepare for this change and evaluate your existing licenses to optimize user license costs? This blog guides you through the process, providing insights from our experienced ERP consultants.

Why is it important to optimize Dynamics 365 F&O licensing cost

As we get closer to the date when notifications will start appearing for users in D365FO, it’s a strategic necessity and important for ERP Admins to consider options for optimizing user licensing costs. Here are some reasons why:

Licenses have a direct cost impact

This one’s obvious. Microsoft D365FO operates on a per-user license model where each user role (e.g., Finance, Operations, Team Member) has a different monthly cost. This means if users are unnecessarily assigned higher-tier roles, your organization is paying more without the added benefit.

For instance:

  • Operations Activity License ≈ $50/user/month
  • Finance or Supply Chain full license ≈ $210/user/month
  • Team Member license ≈ $8/user/month
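
The cost impact of right-sizing is simple arithmetic. The sketch below uses the indicative per-user prices listed above; the user counts are made up for illustration, and actual pricing varies by agreement and region.

```python
# What-if savings calculation using the indicative monthly prices quoted
# above (full ≈ $210, activity ≈ $50, team member ≈ $8 per user).

PRICES = {"full": 210, "activity": 50, "team_member": 8}

def monthly_cost(counts):
    """Total monthly license spend for a tier → user-count mapping."""
    return sum(PRICES[tier] * n for tier, n in counts.items())

before = {"full": 40, "activity": 10, "team_member": 50}
after = {"full": 25, "activity": 25, "team_member": 50}   # after downgrades

print(monthly_cost(before) - monthly_cost(after))   # → 2400 saved per month
```

Downgrading 15 users from full to activity licenses in this hypothetical saves $2,400 per month, or $28,800 per year.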

Unused or overallocated licenses waste money

Many organizations do not periodically review the user roles and licenses they are consuming, which results in overallocated subscription costs. Here’s what you can do:

  • Look for inactive users: Dormant or inactive users may still be consuming paid licenses.
  • Users with full licenses performing only basic tasks: Identify users assigned higher privileges or full license roles (e.g., Operations) who use only low-tier features.
  • Over-licensing or misaligned licensing: For example, assigning a Finance license to someone doing only data entry.

In short, users should only be granted access to the resources necessary to perform their specific job functions.

Scalability and forecasting

When expanding operations, inadequate licensing can impact scalability and system performance, potentially leading to unexpected costs or delays in business operations. Optimizing D365FO licenses is not about reducing functionality; it’s about paying only for what you use.

Microsoft allocates resources based on the user licenses subscribed; therefore, an optimal license enables organizations with:

  • Agile onboarding for new teams
  • Better compliance and license budget forecasting

Ensuring compliance and optimizing licensing in Dynamics 365 Finance & Operations: A self-assessment checklist

Managing licenses efficiently in Dynamics 365 F&O isn’t just about cost control; it’s also key to maintaining compliance. As a user, you can take proactive steps through a self-assessment to identify optimization opportunities and reduce unnecessary licensing expenses. Here’s how to get started:

  1. Analyze user activity: Review user login patterns and system usage to understand who is actively using the application and how.
  2. Compile a list of users and security roles: Gather a comprehensive list of all users, along with their assigned security roles, in the system.
  3. Identify compliance vs. non-compliance: Cross-check assigned roles against Microsoft licensing requirements to flag any non-compliant user-role combinations.
  4. Evaluate high-cost roles: Identify which roles are mapped to higher license tiers, and assess if they’re truly needed based on user responsibilities.
  5. Spot optimization opportunities: Identify instances where roles can be adjusted, combined, or reassigned to align with lower license requirements.
  6. Downgrade security roles where applicable: If certain users don’t need full functionality, consider downgrading them to roles that require a less costly license, without affecting productivity.

Read more: A guide to effective license management for Dynamics 365 F&O

How to downgrade licensing cost without hurting business operations?

Let’s say you’ve completed the steps above, but now you’re facing one key challenge: reducing licensing costs through role downgrades. To help with this, our experts have curated practical steps to guide you in identifying and downgrading excessive licenses without disrupting user productivity.

Step 1: Enable user security governance in Microsoft D365 F&O

The User Security Governance feature is a valuable addition to Dynamics 365 Finance & Operations. It provides a simpler, more structured approach to defining custom security roles, running inquiries, and optimizing license utilization. System administrators can now track changes in security privileges, access to entry points, and license consumption by each user.

To enable User Security Governance (preview), go to System Admin > Workspaces > Feature Management, as shown in the screenshot below.

Note: Please note that Microsoft will continue refining the USG feature to ensure stability ahead of its general availability in July 2025 with the 10.0.44 release. As a result, the features available in build 10.0.43 may differ from those in 10.0.44 due to ongoing enhancements.

You should be able to see the “Security Governance” under System Administration > Security > Security Governance.



Set up user aging periods (parameter)

Now it is time to set up user activity aging periods. To do this, navigate to System Administration > Setup > Security Governance Setup Parameters and set the values for Periods 1 to 5 (as shown in the screenshot below).

Step 2: Analyze user activity

You are now ready to conduct a “User activity audit,” which helps you understand and analyze active users in the system and begin a cleanup activity. This will help optimize license utilization, reduce unnecessary costs, and potentially avoid the need to purchase additional licenses.

Navigate to System Administration > Security > Security Governance > User Activity Aging

The screenshot below shows the user activity inquiry screen. Use this to identify users with extended periods of inactivity. Collaborate with relevant departments to review these users, remove unnecessary licenses, and disable accounts inactive for more than 60 or 90 days.

If you only want to identify inactive users (without removing licenses or blocking them), simply comment out the -BlockedUsers and -RemovedLicenses filters in the User names field under System Administration > Users.

Unused accounts not only consume valuable licenses but also pose compliance and security risks. Proactively managing these accounts ensures better license allocation and reduces administrative overhead.
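
The cleanup decision can be expressed as simple aging buckets. The 30/60/90-day thresholds below mirror the 60/90-day guidance above but are assumptions for illustration, not the product's defaults or configured aging periods.

```python
from datetime import date

# Classify users into review buckets based on days since last login.

def aging_bucket(last_login, today):
    """Map a last-login date to an illustrative review bucket."""
    days = (today - last_login).days
    if days > 90:
        return "disable-candidate"   # inactive 90+ days: disable account
    if days > 60:
        return "review"              # review with the user's department
    if days > 30:
        return "watch"
    return "active"

today = date(2025, 9, 1)
print(aging_bucket(date(2025, 8, 20), today))   # → active
print(aging_bucket(date(2025, 5, 1), today))    # → disable-candidate
```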

Step 3: Conduct license usage audit

Now it’s time to conduct a “License usage audit” to understand who has which license and why. This will help review user access, assign appropriate roles, and identify security roles that may require optimization.

To perform this audit, navigate to System Administration > Security > Security Governance > License Usage Summary.

This report provides two important pieces of information:

  1. User licenses – which licenses, and how many, each user is consuming
  2. User role licenses – the licenses consumed by each user role

User licenses explained

The “User Licenses” tab in the License Usage Summary report lists the type of license each user is currently consuming, based on their cumulative assigned roles, duties, and privileges.
The User Licenses tab also shows the License (e.g., Finance, Supply Chain Management) and License type (Base or Attach) applicable to each user (as highlighted in the screenshot below).

How do “user licenses” help organizations?

  • Cost: Indicates the actual license level being billed for a user (e.g., Operations, Finance, Activity, Team Member)
  • Cost optimization: Helps identify users unnecessarily assigned high-tier licenses
  • Correct licensing: Correlates license level with system access; higher-tier licenses usually grant broader access

For example, “User 1” (shown in red, highlighted in the screenshot above) is listed as consuming three licenses: (1) “Finance”, (2) “Human resource”, and (3) “Supply chain management.” However, their job is limited to reviewing and posting vendor payments. This means unnecessary system access is adding to the cost of licenses.

  • Action: Review his/her roles. It is likely that the user only needs the Finance license (Base).
  • Impact: Downgrading could save ~$60/month/user while minimizing unnecessary access rights

User role licenses explained

The “User Role Licenses” tab in the License Usage Summary report lists all security roles assigned to users and maps each to the minimum required license, based on the highest privilege of any duty or privilege in that role, as shown in the screenshot below.

In the User Role Licenses section, you’ll find two key pieces of information:

  1. License type
  2. License quantity per user and role (highlighted in red in the screenshot above).

Focus only on entries where the License quantity equals 1, as this indicates an active license. Entries with a value of 0 can be ignored. That’s why only three licenses apply to User-1 (highlighted in yellow).

You can now correlate license information for User-1 across both the “User Licenses” and “User Role Licenses” sections.

You’ll notice that both sections indicate three licenses assigned to User-1. However, the User Licenses section provides additional details, including the license type (e.g., Finance, Supply Chain Management, Human Resources) and whether it’s a base or attach license. This distinction is important, as the cost difference between Base and Attach licenses can be significant.

How do “user roles licenses” help organizations?

  • License eligibility check: Determines which roles push a user into a higher license bracket
  • Security role impact analysis: Identifies which roles are license-intensive and should be reviewed and optimized
  • Custom role assessment: Identifies candidate roles for redesign or modification to ensure they stay within a desired license tier

For example, the role “AP_DirectPayment_Custom” is mapped to a Finance license because it includes the duty to approve vendor invoices, a task that requires a higher-tier license. This role is also assigned to multiple users, making it essential to determine whether all users truly need access to this level of functionality.

  • Action: Redesign the role to exclude review & approval privileges, keeping only entry-level tasks
  • Impact: Brings the role under the Activity license, allowing more users to use it cost-effectively.

Using “User Licenses” and “User Role Licenses” for security and cost optimization

Out-of-the-box security roles in Dynamics 365 Finance & Operations often don’t align with the unique needs of an organization. They may lead to higher licensing costs if assigned as-is. To optimize both security and cost, organizations should consider creating custom roles while reusing standard privileges wherever possible. This approach simplifies setup and ensures proper access control.

Key principle: Optimizing D365FO licenses isn’t about reducing functionality; it’s about ensuring you’re only paying for what you use.

By conducting a structured review of license usage and security roles, organizations can:

  • Reduce over-licensing and redundant privileges
  • Align user roles more closely with actual job responsibilities
  • Improve compliance and forecasting
  • Strengthen overall system security

Best practices for license and security optimization

1: Correlate user licenses with job roles
If a user has a high-tier license, determine which role(s) triggered it and redesign accordingly to reduce licensing requirements.
2: Ongoing license monitoring
Continuously monitor license usage and schedule regular reviews with IT and department heads.
3: Implement HR-offboarding policies
Ensure user licenses are removed during the HR offboarding process.
4: Clean up inactive and duplicate users
Remove test/demo accounts and inactive users that consume real licenses.
5: Comply with Microsoft licensing policies
Reevaluate and redesign security roles to minimize permissions and avoid unnecessary high-cost licenses.
6: Use custom roles where appropriate
Avoid assigning default high-privilege roles, such as “System Admin,” to general users. Instead, use modular custom roles such as:

  • Read-only roles
  • Data entry roles
  • Task-specific roles (e.g., AP clerk, warehouse user)

7: Leverage Microsoft’s User Security Governance (USG) feature
The USG feature in D365FO eliminates the need for third-party ISV security tools. It supports:

  • Mapping security roles to business processes
  • Managing access rights and tracking changes
  • Enhancing audits and operational security
  • Ensuring compliance with Microsoft licensing policies
  • Generating license utility reports for better insights

By following these practices, organizations can establish a secure and cost-effective user access structure tailored to real-world usage.

Preparing for licensing updates: Why it matters?

Licensing updates in Dynamics 365 F&O can significantly impact your ERP budget, potentially costing millions if overlooked. With this self-assessment guide, you’re already well on your way to identifying gaps and optimization opportunities.

Now is the ideal time to evaluate your current Dynamics 365 F&O/SCM licensing setup, ensure compliance, and prepare for long-term success. At Confiz, we offer a comprehensive License Compliance Assessment to help organizations assess their current licensing landscape and get ready for Microsoft’s upcoming enforcement policies.

For expert guidance on Dynamics 365 Finance and Operations licensing, cost optimization, or implementation, contact us at marketing@confiz.com.

]]>
Optimizing inventory tracking in Microsoft Dynamics 365 Finance & Operations https://www.confiz.com/blog/optimizing-inventory-tracking-in-microsoft-dynamics-365-finance-operations/ Tue, 01 Jul 2025 07:30:38 +0000 https://www.confiz.com/?p=9472 Accurate inventory tracking is crucial for effective supply chain management and accurate financial reporting. Microsoft Dynamics 365 Finance & Operations (D365 F&O) provides a comprehensive suite of inventory management tools that support a range of functionalities, from real-time warehouse control to product costing. However, the true value of these tools depends on how well they are configured, tested, and adopted within business processes.

This article examines key inventory tracking features in D365 F&O, the benefits and limitations of each, and provides actionable best practices to support a successful implementation.

Key inventory tracking capabilities in Dynamics 365 Finance and Operations

Dynamics 365 Finance and Operations offers robust inventory tracking features designed to boost accuracy, traceability, and operational control. Here are the key capabilities you should know:

1. Multi-Unit of Measure (UoM) tracking

Overview

Supports inventory tracking across various units (e.g., boxes, pallets, pieces), enhancing operational flexibility.

Benefits

  • Simplifies procurement and sales processes
  • Enhances financial and inventory accuracy

Limitations

  • Requires precise configuration and thorough testing
  • User training is essential to prevent transactional errors

Pro tip

Always validate unit conversions using real-life data in a sandbox environment before go-live.
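A validation of this kind can be scripted against historical order data. The conversion factors and sample quantities below are illustrative stand-ins for the Unit conversions setup in D365 F&O.

```python
# Conversion factors to the base unit (pieces); illustrative values.
FACTOR_TO_PIECES = {"Pcs": 1, "Box": 12, "Pallet": 480}  # 1 pallet = 40 boxes

def convert(qty, from_unit, to_unit):
    """Convert a quantity between units via the base unit."""
    return qty * FACTOR_TO_PIECES[from_unit] / FACTOR_TO_PIECES[to_unit]

# Samples taken from real historical orders, with expected results
# confirmed by warehouse staff (values here are made up):
samples = [
    (2, "Pallet", "Box", 80),
    (5, "Box", "Pcs", 60),
]
for qty, src, dst, expected in samples:
    assert convert(qty, src, dst) == expected, (qty, src, dst)
print("All sampled conversions match")
```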

2. Catch weight management (Variable weight items)

Overview

Ideal for industries such as food, chemicals, or metals, where items vary by weight but are sold by quantity.

Benefits

  • Supports weight-based pricing and costing
  • Enables accurate inventory valuation for variable-weight products

Limitations

  • May require integration with digital scales or weighing systems
  • Adds complexity to transactional flows

Pro tip

Use catch weight tracking only when weight impacts pricing, cost, or compliance.

Read more: How to set up catch weight products in Dynamics 365 Finance and Operations
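To illustrate the idea, the sketch below computes an invoice amount for a catch-weight line, where the weighed quantity rather than the unit count drives the value. All names and values are illustrative, not D365 code.

```python
def catch_weight_amount(actual_weight_kg, price_per_kg):
    """Invoice amount for a catch-weight line is driven by the weighed
    quantity, not by the unit count on the order line."""
    return round(actual_weight_kg * price_per_kg, 2)

# 10 boxes of meat, nominal 5 kg each, actually weighed at 51.3 kg total:
print(catch_weight_amount(51.3, 4.50))  # 230.85 rather than 10 * 5 * 4.50
```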

3. Lot and serial number tracking

Overview

Allows tracking of individual items or item batches through lot or serial numbers.

Benefits

  • Ensures traceability for regulatory compliance
  • Enhances quality control and recall management

Limitations

  • Increases data entry and operational time
  • Not suitable for all item types

Pro tip

Restrict tracking to regulated or high-value items to maintain operational efficiency.

4. Advanced Warehouse Management (WMS)

Overview

Leverages barcode scanning, mobile workflows, and automation for real-time inventory visibility.

Benefits

  • Streamlines warehouse operations
  • Reduces errors through mobile validation and automation

Limitations

  • Requires detailed design and process testing
  • Must be piloted in a controlled environment

Pro tip

Implement gradually, starting with a specific warehouse zone before scaling organization-wide.

Further reading: Drive operational excellence with Dynamics 365 Warehouse Management Solution

5. Inventory costing and valuation methods

Overview

Supports methods such as FIFO, LIFO, moving average, and standard costing to align with financial reporting standards.

Benefits

  • Ensures accurate cost tracking
  • Supports audit and compliance requirements

Limitations

  • Requires collaboration with finance stakeholders
  • Periodic reconciliation is necessary

Pro tip

Select the appropriate costing method early and align it with organizational accounting practices.
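The practical difference between methods shows up in a small worked example. The receipts and issue below are illustrative; in D365 F&O the actual cost application happens during inventory close, not in custom code.

```python
# Two receipts of the same item at different costs, then one issue.
receipts = [(100, 10.00), (100, 12.00)]  # (quantity, unit cost)

def fifo_issue_cost(receipts, issue_qty):
    """FIFO: consume the oldest receipts first."""
    cost, remaining = 0.0, issue_qty
    for qty, unit_cost in receipts:
        take = min(qty, remaining)
        cost += take * unit_cost
        remaining -= take
        if remaining == 0:
            break
    return cost

def moving_average_issue_cost(receipts, issue_qty):
    """Moving average: issue at the weighted-average cost on hand."""
    total_qty = sum(q for q, _ in receipts)
    total_value = sum(q * c for q, c in receipts)
    return issue_qty * total_value / total_qty

print(fifo_issue_cost(receipts, 150))            # 1600.0  (100*10 + 50*12)
print(moving_average_issue_cost(receipts, 150))  # 1650.0  (150 * 11.00)
```

The same issue carries a different cost under each method, which is why the choice must be aligned with finance stakeholders up front.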

6. Product variants and inventory dimensions

Overview

Enables item tracking at the variant level (e.g., size, color, configuration) for improved master data management.

Benefits

  • Simplifies reporting and stock management
  • Reduces item duplication

Limitations

  • Excessive variant use can lead to system complexity
  • Requires disciplined data governance

Pro tip

Maintain a clean variant matrix and adhere to consistent naming conventions.

Best practices for inventory tracking in D365 F&O

1. Conduct a business process review

Analyze current processes before selecting tracking options, particularly for costing, fulfillment, and inventory reconciliation.

2. Test with real data

Simulate transactions in a sandbox environment, including edge cases like returns, partial deliveries, and non-standard UoMs.

3. Focus on user training

Go beyond “how” and explain the “why” to warehouse, procurement, and finance teams.

4. Implement barcode scanning and mobile apps

Enhance speed and accuracy by utilizing D365 mobile apps for inventory management and counting.

5. Schedule regular inventory reconciliation

Utilize cycle counting and inventory journals to identify and correct discrepancies.

6. Track key inventory metrics

Monitor KPIs like inventory accuracy, turnover rate, and warehouse performance using embedded Power BI or third-party analytics.
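Two of the KPIs mentioned above reduce to simple formulas. The definitions and inputs below are illustrative (inventory accuracy in particular can be defined in several ways; this sketch uses the share of cycle-count lines that matched the system quantity).

```python
def inventory_accuracy(correct_counts, total_counts):
    """Share of cycle-count lines that matched the system quantity, %."""
    return correct_counts / total_counts * 100

def turnover_rate(cogs, average_inventory_value):
    """How many times inventory 'turns over' in a period."""
    return cogs / average_inventory_value

print(round(inventory_accuracy(970, 1000), 1))  # 97.0
print(turnover_rate(1_200_000, 200_000))        # 6.0
```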

Concluding thoughts: Strategic advantage through inventory tracking

Effective inventory tracking in Microsoft Dynamics 365 Finance and Operations is more than a system configuration; it is a strategic initiative that strengthens operational reliability, financial transparency, and supply chain agility.

Organizations that implement D365 F&O’s inventory tracking capabilities with discipline and purpose can achieve improved forecasting accuracy, streamlined compliance, and better customer service.

Explore Confiz’s Microsoft Dynamics 365 Supply Chain Management services to drive smarter tracking and enhanced operational excellence. Contact us at marketing@confiz.com to discover how our experts can help you.

]]>
Low-code AI for Dynamics 365 F&O: Exploring Microsoft Copilot Studio https://www.confiz.com/blog/low-code-ai-for-dynamics-365-fo-exploring-microsoft-copilot-studio/ Wed, 25 Jun 2025 11:11:02 +0000 https://www.confiz.com/?p=9439 If you are using Microsoft Dynamics 365 Finance and Operations, you may have encountered difficulties accessing data and creating automations without coding. This challenge has been addressed thanks to Microsoft Copilot Studio!

Microsoft Copilot Studio’s low-code approach enables customers to construct intelligent, conversational agents that can interact with Microsoft Dynamics 365 for Finance and Operations (D365 F&O), all without writing a single line of X++ code. Whether you’re automating data retrieval, initiating Power Automate flows, or improving customer interactions, Microsoft Copilot Studio enables you to do more with less work.

In this article, we explore Microsoft Copilot Studio and how integrating it with Dynamics 365 Finance and Operations (F&O) can open up new possibilities.

What is Microsoft Copilot Studio? Let’s understand this platform

Microsoft Copilot Studio is Microsoft’s low-code development platform designed to create intelligent AI assistants (Copilots) tailored to business needs.

Some of its capabilities include:

  • Graphical interface for building workflows and automation.
  • Seamless integration with Power Automate and external APIs.
  • Easy addition of custom topics that interact directly with Microsoft Dynamics 365 for Finance and Operations.
  • Immediate publishing of topics for real-time user access.

Further readings: A new era in low-code and no-code evolution with Copilot Studio

Features and capabilities of Microsoft Copilot Studio

Below are some key features and capabilities of Microsoft Copilot Studio:

Adding Knowledge Sources in Microsoft Copilot Studio

In Microsoft Copilot Studio, knowledge sources collaborate with generative answers to enable agents to provide more relevant and helpful responses. When knowledge sources are added, agents can retrieve data from sources such as Power Platform, Dynamics 365, websites, and other external systems.

These sources can be added while creating an agent, after it has been created, or directly in a topic using a generative answers node.

Supported knowledge sources

You can connect the following types of knowledge sources:

  • Public websites: Enables the agent to search using Bing and return results only from the specified URLs.
  • SharePoint: Connects to a SharePoint URL and uses GraphSearch to return results.
  • Dataverse: Uses data from your configured Dataverse environment with retrieval-augmented generation.
  • Documents: Searches documents that are uploaded to Dataverse and returns content from them.
  • Enterprise Data via Copilot Connectors: Connects to organization data indexed by Microsoft Search.

Generative answers using knowledge sources

By default, Microsoft Copilot Studio includes a “Conversational boosting” topic with a generative answers node. All knowledge sources added at the agent level are automatically utilized by this node.
Generative answers reduce the need to create a topic for every question, thereby cutting manual effort and speeding up deployment.

Authentication

When using sources such as SharePoint, Dataverse, or enterprise data, the agent will utilize Microsoft Entra ID authentication. This ensures that users only see content to which they have permission.

The role of actions in Microsoft Copilot Studio

In Microsoft Copilot Studio, actions are used to connect your Copilot to external systems and services. These actions power real-time data queries, process automation, and AI-driven responses.

Types of actions available in Copilot Studio

The way your Copilot communicates with other systems, such as D365 F&O, is determined by the actions you configure. Here are a few supported action types:

  • Prebuilt connector actions – Use Microsoft’s built-in connectors to seamlessly integrate with services such as Dataverse, SharePoint, and others.
  • Custom connector actions – Connect to your APIs by establishing custom connectors tailored to your business requirements.
  • Power Automate flows – Automate and orchestrate business logic using Power Automate. Useful for triggering multi-step workflows or approvals.
  • AI Builder prompts – Generate dynamic responses using AI models. Ideal for summarizing content, extracting information, or providing intelligent suggestions.
  • Bot framework skills – Extend Copilot’s capabilities using custom skills developed in the Bot Framework. Best suited for offloading complex conversations or modular logic.
  • REST API calls – Make direct API calls to third-party or internal systems, including D365 F&O, giving you full control over request and response handling.
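As an illustration of the last action type, the sketch below builds a GET request against the F&O OData endpoint. The environment URL, entity name, and token are placeholders; in practice the bearer token would be acquired from Microsoft Entra ID.

```python
from urllib.parse import urlencode

def build_odata_request(env_url, entity, odata_filter, token):
    """Return (url, headers) for a GET against a D365 F&O data entity."""
    query = urlencode({"$filter": odata_filter, "$top": "10"})
    url = f"{env_url}/data/{entity}?{query}"
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
    }
    return url, headers

url, headers = build_odata_request(
    "https://contoso.operations.dynamics.com",  # placeholder environment
    "CustomersV3",
    "dataAreaId eq 'usmf'",
    "<access-token>",  # obtained from Microsoft Entra ID in practice
)
print(url)
```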

Dynamics 365 F&O and Copilot Studio: What does this integration enable?


Integrating Microsoft Dynamics 365 for Finance and Operations (F&O) with Microsoft Copilot Studio opens up powerful AI-driven experiences by combining enterprise data with natural language interaction. Here’s how it works and why it matters:

  • Custom AI tools: You can build AI tools that tap into F&O business logic using X++ code, then expose them as Copilot tools via Dataverse APIs.
  • Natural language interaction: Users can ask questions like “What’s the current balance for customer Contoso?” and Copilot will invoke backend logic to calculate and return the answer.
  • Embedded and sidecar experiences: Copilot can appear as a side panel (sidecar) or be embedded directly into F&O apps, offering contextual help, summaries, and task automation.

Conclusion

Microsoft Copilot Studio bridges the gap between Dynamics 365 Finance and Operations and users by providing an intuitive, low-code platform for building conversational AI experiences. This approach streamlines data retrieval, reduces manual effort, and helps teams make faster decisions. Ready to transform how your organization works with Dynamics 365 F&O? Contact us at marketing@confiz.com.

]]>
Model Context Protocol for Dynamics 365 Finance and Operations and Copilot Studio: A complete guide https://www.confiz.com/blog/model-context-protocol-for-dynamics-365-finance-and-operations-and-copilot-studio-a-complete-guide/ Tue, 24 Jun 2025 14:22:06 +0000 https://www.confiz.com/?p=9424 Modern AI applications increasingly require seamless access to enterprise data and tools across diverse environments. As organizations build intelligent agents at scale, it becomes essential to standardize how these systems connect to business logic, invoke tools, and adapt to evolving scenarios. This is where the Model Context Protocol (MCP) proves invaluable.

In this blog, we explore the Model Context Protocol (MCP) for Finance and Operations apps and how it enables scalable, intelligent integrations, especially within Copilot Studio. You’ll gain a clear understanding of what the Model Context Protocol is, its architecture, real-world use cases, and how to build and deploy your own MCP server using Copilot Studio.

What is Model Context Protocol (MCP)?

The Model Context Protocol (MCP) is an open standard that connects large language models (LLMs) to external tools, business data, and APIs in a structured and uniform way. Often referred to as the “USB-C for AI agents,” the MCP Model Context Protocol eliminates the need for custom integration logic, enabling plug-and-play AI connectivity.

What are the main benefits of using the Model Context Protocol?

The Model Context Protocol (MCP) is designed to facilitate efficient and accurate interaction between large language models (LLMs) and external systems or applications. Here are the main benefits of using the Model Context Protocol:

  • Unified access to business logic and data across multiple finance and operations applications.
  • Cross-platform agent reuse for streamlined development and maintenance.
  • Tool interoperability, allowing tools to be accessed from any MCP-compatible agent platform.
  • Simplified development experience, reducing overhead for building and connecting intelligent agents.

Overview of the MCP architecture

The Model Context Protocol architecture is based on a modular client-server model. The main components include:

  • MCP Hosts
  • MCP Clients
  • MCP Servers

These elements work together to enable flexible, standardized communication between AI agents and enterprise systems. Below is a Model Context Protocol architecture diagram to illustrate the high-level data flow and interaction.
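To make the client–server interaction concrete, the sketch below shows the general shape of a JSON-RPC 2.0 message an MCP client sends to an MCP server to invoke a tool. The tool name and arguments are hypothetical.

```python
import json

# Shape of an MCP tool-invocation request (JSON-RPC 2.0).
# "getcustomerbalance" and its argument are illustrative names only.
tool_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "getcustomerbalance",
        "arguments": {"customerAccount": "US-001"},
    },
}
print(json.dumps(tool_call, indent=2))
```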

Extending Dynamics 365 F&O’s capabilities using MCP

Dynamics 365 F&O contains extensive business logic and data, making it ideal for LLM-based copilots. By building an MCP server over your D365 APIs or databases, you can create agents that read data, trigger actions, and answer complex business questions.

Read more: Key features to explore in Microsoft Dynamics 365 Finance and Operations in 2025

Prerequisites

Before using the Dynamics 365 ERP MCP server, ensure the following versions are met:

  • Finance and Operations apps version: 10.0.44 (10.0.2263.17) or later
  • Copilot in Microsoft Dynamics 365 Finance: 1.0.3049.1 or later
  • Copilot in Microsoft Dynamics 365 Supply Chain Management: 1.1.03046.2 or later

Introducing the default MCP server in Microsoft Dynamics 365

Microsoft Dynamics 365 ERP now includes a built-in MCP server. This server exposes tools from Dynamics 365 Finance and Operations applications to agent platforms that support MCP, enabling the following capabilities:

  • Agent access to data and business logic across multiple apps
  • Reuse of agents across ERP systems
  • Tool interoperability across any MCP-compatible agent platform
  • A simplified agent development experience

Using the Dynamics 365 ERP MCP Server in Copilot Studio

You can use the Dynamics 365 ERP MCP server to create agents in Microsoft Copilot Studio, too. The server provides tools that enable actions within Dynamics 365 Finance and Supply Chain Management.

Integrating the MCP server Copilot Studio workflow is straightforward:

  • Open or create an agent in Copilot Studio.
  • Navigate to the Tools tab and select ‘Add a tool.’
  • Filter by Model Context Protocol and search for ‘Dynamics 365 ERP MCP.’
  • Create a connection and add it to the agent.

Once connected, the agent can leverage all tools made available through the server to interact with your finance and operations data.

Read more: How to build your own copilot in Microsoft Copilot Studio?

An overview of Dynamics 365 ERP MCP tools: What’s available?

The Dynamics 365 ERP MCP server includes a static list of predefined tools. Each tool is backed by a custom Dataverse API, which defines its schema and performs the operation. You can find these APIs in the corresponding Dataverse solutions:

  • Copilot in Microsoft Dynamics 365 Finance
  • Copilot in Microsoft Dynamics 365 Supply Chain Management

Each tool includes:

  • A description of its purpose
  • A schema definition via the Dataverse custom API
  • A list of input parameters
  • A set of expected outputs

Here are some tools:

1. Find approved vendors

  • Name: findapprovedvendors
  • Purpose: Retrieves vendors approved to supply specific items
  • Input Parameters:
    • ItemNumber (String, Optional) — The item ID to filter approved vendors
    • vendorAccountNumber (String, Optional) — If provided, limits the result to this vendor
  • Use Case: Used by procurement agents to validate sourcing.
  • Output:
    • Itemnumber (String) — The item number.
    • approvedvendoraccountnumber (String) — The vendor account number of the approved vendor for the item.
    • validfrom (datetime) — The date and time from which the approval is valid.
    • validto (datetime) — The date and time until which the approval is valid.
  • Custom API: msdyn_FindApprovedVendors
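As a hypothetical illustration of how a procurement agent might consume this tool's output, the sketch below hard-codes a sample result row (field names taken from the output list above) and checks a vendor against its validity window. The helper function and sample data are illustrative, not part of the product.

```python
from datetime import datetime

# Hypothetical result rows shaped like the outputs listed above.
approved = [
    {
        "Itemnumber": "D0001",
        "approvedvendoraccountnumber": "US-101",
        "validfrom": "2025-01-01T00:00:00Z",
        "validto": "2025-12-31T23:59:59Z",
    },
]

def is_approved(rows, vendor, on_date):
    """Check whether a vendor is approved on a given date."""
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return any(
        row["approvedvendoraccountnumber"] == vendor
        and datetime.strptime(row["validfrom"], fmt)
        <= on_date
        <= datetime.strptime(row["validto"], fmt)
        for row in rows
    )

print(is_approved(approved, "US-101", datetime(2025, 6, 15)))  # True
```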

2. Create a transfer order for a single item

  • Name: createtransferorderforsingleitem
  • Purpose: Creates a transfer order for a specified item.
  • Input Parameters:
    • ItemNumber (String, Required) — The item code to transfer.
    • fromWarehouseId (String, Required) — Warehouse ID from where the item will be shipped.
    • toWarehouseId (String, Required) — Warehouse ID to which the item will be sent.
    • quantity (int, Required) — The number of items to transfer.
  • Use Case: Used in inventory management agents to automate internal stock movements.
  • Output:
    • result (String) — A message indicating the result of the transfer operation.
  • Custom API: msdyn_CreateTransferOrderForSingleItem

3. Match the invoice

  • Name: matchinvoice
  • Purpose: Matches vendor invoice with product receipt.
  • Input Parameters:
    • invoiceId (String, Required) — Vendor invoice number to be matched.
  • Use Case: Ensures invoicing and receipt records align.
  • Output:
    • fullyMatched (Boolean) — Whether the invoice has been fully matched.
  • Custom API: msdyn_VendInvoiceMatchProductReceiptCustomAPI

Conclusion

Successfully integrating the Model Context Protocol (MCP) with Microsoft Dynamics Finance and Operations empowers businesses to create intelligent agents that are scalable, secure, and deeply embedded into their operational processes. At Confiz, we’ve seen that a well-planned MCP implementation, paired with strong technical expertise and cross-functional alignment, can enhance automation, streamline data access, and support faster and smarter decision-making across the enterprise.

Considering MCP for your AI integration strategy? We’ll help you evaluate compatibility, identify gaps, and plan a seamless rollout. For tailored guidance or to explore your integration needs, connect with us at marketing@confiz.com.

]]>
Automate with precision: The role of financial tags and default rules in Dynamics 365 Finance https://www.confiz.com/blog/automate-with-precision-the-role-of-financial-tags-and-default-rules-in-dynamics-365-finance/ Mon, 23 Jun 2025 07:34:31 +0000 https://www.confiz.com/?p=9355 Modern finance teams are under growing pressure to do more with their data. As transactions become complex, so does the need for better traceability, richer context, and smarter reporting. Traditional financial structures in Microsoft Dynamics 365 Finance offer tools like financial dimensions. Still, those tools alone aren’t always enough for users who need more flexibility, fewer constraints, and deeper insight at the transaction level.

That’s where the financial tags come in. Introduced as a new capability in Dynamics 365 Finance, financial tags allow users to assign up to 20 customizable values to transactional records, unlocking more granular visibility without the rigidity of predefined account structures. These tags aren’t just visible when posting and carrying through voucher transactions; they also enable more intuitive filtering and data exploration.

In this blog, we’ll explore how D365 financial tags work, how they differ from traditional financial dimensions, and how to configure them. We’ll also discuss automation through default rules that can save time and improve consistency across your financial operations.

What are financial tags in Dynamics 365 Finance?

Financial tags in Dynamics 365 Finance are a powerful feature that lets you add up to 20 user-defined fields to transactions for enhanced tracking and reporting, without modifying your chart of accounts or financial dimensions.
Financial tags are lightweight, flexible labels you can attach to transactions (like journal entries, invoices, or payments) to capture extra context, such as:

  • Sales order numbers
  • Vendor or customer names
  • Campaign codes
  • Internal project references

They’re especially useful for internal analysis, audit trails, and ad hoc reporting.

Financial tags vs financial dimensions: What’s the difference?

A common misconception among Dynamics 365 users is that financial tags D365 function similarly to financial dimensions. While both tools help categorize and track financial data, they serve different purposes and behave differently in the system. Understanding the distinction between financial tags vs financial dimensions is key to using each effectively and avoiding misconfigurations, and some of these are listed below:

| Financial Dimensions | Financial Tags |
| --- | --- |
| Must be included in account structures or advanced rules to be functional. | Not tied to account structures, offering more flexibility and fewer validation constraints. |
| Undergo validation during accounting entry based on active status and structure rules. | Not subject to structural validations during entry, making them easier to apply. |
| Useful in financial dimension sets and compatible with Management Reporter for detailed financial reporting. | Not supported in dimension sets or trial balances, but visible in separate voucher columns, which is ideal for filtering transactions. |
| Can be associated with master data so that values default onto transactions. | Cannot be associated with master data; applied at the journal header level and can default to lines. |
| Require maintenance mode to activate new dimensions. | Activated without maintenance mode (activation runs as a batch job). |
| Once a transaction is posted, dimension values cannot be changed. | Values of financial tags in D365 can be edited or removed even after posting. |

Configuring financial tags in D365 Finance: Step-by-step guide

Setting up financial tags in D365 Finance is a straightforward process, but it requires a few key steps to ensure proper functionality. Follow this guide to enable the feature, define tag behavior, and make them available across your journals.

Step 1:

First, navigate to the feature management and enable the “Financial Tags” feature.

Step 2:

We need to add a delimiter that separates each financial tag from the others. This will be defined in the General Ledger parameters > Financial Tags Tab.

Note: Choose this delimiter wisely because it can’t be used in the Financial Tag values, which we will explore further in later parts of this blog.
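To see why the delimiter choice matters, here is a minimal sketch of how delimiter-separated tag values behave. The delimiter and values are illustrative, not the product's internal implementation.

```python
DELIMITER = "_"  # the delimiter configured in General ledger parameters

def combine_tags(values):
    """Join tag values into a single delimiter-separated string; reject
    values that contain the delimiter, since they could not be split
    back apart unambiguously."""
    for value in values:
        if DELIMITER in value:
            raise ValueError(
                f"Tag value {value!r} contains the delimiter {DELIMITER!r}"
            )
    return DELIMITER.join(values)

print(combine_tags(["Europe", "Consulting"]))  # Europe_Consulting
```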

Step 3:

We are all set to create the D365 F&O financial tags.

Navigate to General Ledger > Chart of Accounts > Financial Tags > Financial Tags.

Step 4:

Click on the New button to create a new financial tag in Dynamics 365 Finance and Operations.

Step 5:

Fill in the Name of the financial tag in the field “Financial tag”.

Step 6:

Select the correct Value Type. This defines the type of values for the financial tags that will be available for the user’s selection.

We have the following options available:

  • Text: The user can type any desired value in the field without restrictions.
  • List: If this option has been selected, we need to define the entity in the Use Values from field so that relevant options, such as Bank Account Names, Business Units, Customer accounts, customer groups, fixed asset numbers, item groups, etc., are only available for selection.
  • Custom List: Under this option, we can define a custom list as tag values so that users can specifically choose from the custom list only.

To define the custom list for selection, click the Tag Values button.

Step 7:

Define the values that you require for your custom list.

Step 8:

To make the financial tags available, click Activate or deactivate tags, move the required tags to the Active financial tags section, and click OK.

Once the batch job has completed, you will see the previously selected financial tags as Active.

Walkthrough: Using financial tags in a vendor invoice journal

Once your financial tags are configured and activated, you can apply them during transaction entry. Here’s how to test them using a vendor invoice journal:

Step 1:

For demonstration purposes, create the following three financial tags:

| Financial Tag | Value Type | Use Values from |
| --- | --- | --- |
| DepartmentTag | Custom List | — |
| Campaign | Text | — |
| FixedAsset | List | Fixed Asset Number |

Step 2:

When posting the Vendor Invoice journal, you will see the Financial tags and Offset financial tags fields to fill in.

Step 3:

Once you click the drop-down, you will see the active financial tags and the allowed tag values for each. The user can then select the financial tag values to be tagged with the invoice journal.

Step 4:

Alternatively, you can define the financial tag values on the header level and populate them on all the subsequent lines.

Step 5:

Post the vendor invoice journal.

Step 6:

Let’s check the voucher together.

Here, you will observe that financial tags have also been tagged with the posted financial voucher.

Step 7:

Next, once the “Enable financial tags for Accounting Source Explorer” feature is enabled, you can find these financial tags in the Accounting Source Explorer.

Step 8:

You can explore the accounting source explorer for the earlier posted expense account and check whether these financial tags are available there.

Step 9:

We have confirmed that financial tags and their relevant values are also available in the accounting source explorer.

Editing financial tag values after posting

One advantage of financial tags in Dynamics 365 is that you can edit them even after the voucher has been posted. Suppose you mistakenly selected the wrong financial tag values when posting a journal or invoicing a purchase order. With financial dimensions, you would likely have to reverse and repost the journal; with financial tags, that is unnecessary.

To correct any financial tag values, you need to follow the steps:

Step 1:

You can navigate to the Voucher Transactions.

Step 2:

Select the relevant voucher and click OK to view the specific voucher transaction.

Step 3:

Select the transaction/transactions and click on the Edit voucher.

Step 4:

Then select the option “Edit internal voucher data”.

Step 5:

Now, fill in the following information:

  • In the field “New financial tags,” select the new / required financial dimension. In our case, we have changed the DepartmentTag from “Finance” to “Human Resources”.
  • Write the reason for the changes in the “Reason for edit” text box. In our case, it’s “Wrong Department selected”.

Step 6:

Finally, click the OK button.

You will notice that the financial tag values have been changed as per the desired values.

Automating tag values with financial tag rules in D365 Finance

To simplify data entry and improve consistency, Microsoft Dynamics 365 Finance (as of version 10.0.42) introduces Financial Tag Rules (preview). This feature automatically populates tag values based on user-defined conditions, with no need for manual selection or Power FX coding.

Let’s walk through how to configure these rules using the built-in rule builder and test their functionality using a real-world scenario.

Supported transaction types (as of version 10.0.42)

Financial Tag Rules currently work with:

  • General journal
  • Allocation journal
  • Reporting currency adjustment journal
  • Vendor invoice journal

Scenario or the test case

We will now be setting up the financial tag rules as per the following criteria:

  • If the Business Unit = 003 or 004, then the DepartmentTag should be “Consulting”; otherwise, the DepartmentTag should be “Finished Goods”
  • If the Currency is “USD,” then the OrderLocation should be “Local.” If the currency is “EUR,” then the OrderLocation should be “Europe.” Otherwise, the OrderLocation financial tag should be “Rest of the world.”
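Expressed as plain code, the two rules above behave like this. This is a sketch of the logic only; in D365 the rules are configured in the rule builder, not coded.

```python
def department_tag(business_unit):
    """Rule 1: business units 003/004 map to Consulting,
    anything else to Finished Goods."""
    return "Consulting" if business_unit in ("003", "004") else "Finished Goods"

def order_location(currency):
    """Rule 2: currency drives the OrderLocation tag."""
    return {"USD": "Local", "EUR": "Europe"}.get(currency, "Rest of the world")

print(department_tag("003"), order_location("EUR"))  # Consulting Europe
print(department_tag(""), order_location("USD"))     # Finished Goods Local
```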

Real-world use case: Configuring financial tag rules

Step 1:

Navigate to the General Ledger > Chart of Accounts > Financial Tags > Financial tag rules (preview).

Step 2:

Click on “New” to create a new financial tag rule.

Step 3:

Select or define the following data on the financial tag rule setup page:

  • Name: This is the name you want to assign to this rule.
  • Enable: If we want to activate the rule as soon as your setup gets completed, then set this option to “Yes”.
  • Overwrite existing value: If you want this rule to override the existing values selected in the journals, set this option to “Yes.” Otherwise, let it be set to “No.”
  • Transaction entry point: Select the document where this rule will apply, such as the General journal, allocation journal, reporting currency adjustment journal, or vendor invoice journal.
  • Transaction level: We can define our financial tag rule for one or more of the following, turn by turn:
    • Header (Here we can define the rule based on the Journal batch number)
    • Account (Here we can define the rule for the financial tags)
    • Offset account (Here we can define the rule for the offset financial tags)
  • Target: This requires us to define the relevant or targeted Financial Tag to which the rule will be applied.

Step 4:

Let’s set up the rule by clicking on the “Conditions”.

Step 5:

Click on the “New Condition”.

Step 6:

To comply with our test case scenario, set the rules with the following conditions:

  • Rule 1

If Business Unit is 003 or Business Unit is 004, then the target value’s custom text is “Consulting”; else, the value of the financial tag should be “Finished Goods”

  • Rule 2
    • If Currency = USD, then OrderLocation should be “Local”
    • If Currency = EUR, then OrderLocation should be “Europe”
    • Else the OrderLocation should be “Rest of the World”

Step 7:

Now, we can create and post a Vendor Invoice Journal to validate whether our financial tag rules are working perfectly.

Note: Ensure that your financial tag rule is enabled so it applies to vendor invoice journals.

Step 8:

Now, we will create an invoice journal and set the Currency to USD first. You will notice that “Local” is selected for OrderLocation. We will also leave the Business Unit financial dimension blank so that “Finished Goods” is selected for the DepartmentTag.

Step 9:

Now, we will change the Currency to “EUR,” which must fetch “Europe” as the tag value for OrderLocation. Then, we will select Business Unit 003 so that the DepartmentTag automatically selects “Consulting.”

So the value populated in the Financial tags field, “Europe_Consulting,” matches the criteria of our financial tag rules in Microsoft Dynamics 365.

Conclusion

Leveraging financial tags and automation rules in Microsoft Dynamics 365 Finance opens the door to more agile, accurate, and insightful financial operations. By moving beyond traditional dimensions, organizations can reduce reliance on rigid structures, streamline journal processing, and enhance data traceability, without compromising control.

At Confiz, we’ve helped finance teams implement and scale financial tagging strategies that simplify reporting and enable automation through rule-based defaults. A well-planned approach is key, whether you’re just starting out or looking to optimize existing processes.

Need expert guidance on setting up financial tags or automating tag logic in D365? Contact us at marketing@confiz.com. We are here to help you drive efficiency and unlock better financial insight.

]]>
Unit of measure for line-level charges in D365: A small feature with big impact https://www.confiz.com/blog/unit-of-measure-for-line-level-charges-in-d365-a-small-feature-with-big-impact/ Fri, 20 Jun 2025 06:48:24 +0000 https://www.confiz.com/?p=9364 In a globalized, cost-conscious business landscape, every detail in supply chain finance matters, including how charges are applied at the line level. Microsoft Dynamics 365 Finance and Operations continues to evolve with real-world business needs in mind. One such enhancement is assigning Units of Measure (UOM) to line-level charges.

This seemingly minor feature offers significant flexibility and control over how additional costs are distributed and calculated on purchase or sales documents.

Explore the purpose and benefits of the Unit of Measure in Dynamics 365 F&O and learn how to configure it for accurate and streamlined operations.

What is the “Unit of Measure for line-level charges” feature?

Previously, in Dynamics 365, line-level charges (such as handling fees, container surcharges, environmental taxes, etc.) were often calculated using fixed or percentage-based values. But businesses that offer bulk, volume, or per-unit-based services need more granular control.

In Dynamics 365, a Unit of Measure (UOM) defines how quantities of products or services are tracked, sold, purchased, or stored. It’s a foundational concept across modules like Finance and Operations, Sales, Commerce, and Supply Chain.

With this feature, users can now apply charges based on a specific unit of measure (UOM) such as:

  • Per unit
  • Per kilogram
  • Per pallet
  • Per liter, etc.

This makes the charges more dynamic, scalable, and aligned with real-world logistics and procurement costs.

Application of Unit of measure: A business use case

Let’s say your company imports chemicals in bulk:

  1. You ordered 10,000 liters of solvent from a supplier.
  2. The supplier charges a hazard handling fee of $0.50 per liter.

Without UOM-based charges: You must manually calculate and enter a total fixed charge (e.g., $5,000) for the handling fee.

With a UOM-based charge: You can define a charge code like “Hazard Handling” with a UOM = Liter and rate = $0.50.

D365 will now:

  • Automatically multiply the charge per UOM by the ordered quantity.
  • Apply it as a line-level charge.
  • Recalculate the charge dynamically with quantity changes
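
The arithmetic behind this is simply quantity × rate. As a quick sketch using the solvent numbers from the scenario above (the script is purely illustrative of what D365 computes):

```shell
# Solvent example: hazard handling fee of $0.50 per liter on 10,000 liters
quantity_liters=10000
rate_per_liter=0.50
charge=$(awk -v q="$quantity_liters" -v r="$rate_per_liter" \
  'BEGIN { printf "%.2f", q * r }')
echo "$charge"   # 5000.00 - the line-level charge D365 applies automatically
```

If the ordered quantity later changes to 8,000 liters, D365 recalculates the charge to $4,000 with no manual intervention.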

Examples of unit-based charges by industry

Industry | Use case
Retail and FMCG | Packaging surcharge per unit or box
Manufacturing | Environmental fees per kg of raw material
Logistics | Handling charge per pallet or container
Pharma & Chemicals | Hazard fees per liter or bottle
Food & Beverage | Refrigeration surcharge per case

Benefits of UOM-based charges

Using Units of Measure (UOM) in Dynamics 365 brings many practical benefits across inventory, sales, purchasing, and production. Here’s how they make life easier:

  • Automation and accuracy

No more manual charge calculations or spreadsheets! D365 handles it automatically based on item quantity and UOM.

  • Improved cost tracking

Helps ensure that true landed or operational costs are applied accurately, making profitability and pricing analysis more reliable.

  • Compliance and transparency

Supports UOM-based tax, environmental, or regulatory fees, aiding compliance and audit readiness.

  • Dynamic charge application

Charges are recalculated automatically when quantities or UOMs are updated, maintaining financial accuracy.

How to set up Unit of Measure in Dynamics 365 (An overview)

Setting up Units of Measure (UOM) in Dynamics 365 Finance & Operations ensures accurate inventory, pricing, and transaction handling across your supply chain. Here’s how to do it:

  1. Navigate to: Procurement and Sourcing > Setup > Charges > Auto Charges (for Purchase Order).
  2. Click on the Level dropdown and select “Line.”
  3. Create a new Auto Charge.
  4. Select a Category, then set:
    • Specific Unit / Specific Unit Match
    • Unit of measure: select the desired UOM (e.g., kgs or liters)
    • Charge value: e.g., 2.00 per item

    When this charge is applied to a PO or SO line, D365 applies quantity × rate logic using the selected UOM.
    Once you have created a PO and selected the item to procure, enter the quantity and amount (if needed) and select the unit of measure.

    Then, click on the “Purchase Order” fast tab on the “View” tab, click on “Total”, and view the charges for the purchase order.

    The charges shown are the auto charges set up under Procurement and sourcing > Auto charges.

    Here, the charge of $20 is calculated as: Quantity procured × Charge value = 10 × 2 = $20.

    These charges will be automatically added to the PO Invoice at the time of invoicing.

    Note: For sales orders, set up the auto charges under Accounts receivable.

    Tips to follow

    • Use consistent naming for clarity across teams.
    • You can also use the Unit creation wizard to auto-generate common UOMs and conversions.
    • For multilingual environments, add translations for UOMs under the “Translations” tab.

    Final thoughts

    The “Unit of Measure for Line-Level Charges” feature may appear to be a minor configuration option. Still, it reflects D365’s ability to adapt to real-world financial, logistical, and compliance challenges. For businesses dealing with volume-based pricing or import/export complexities, this capability can drive cost accuracy, automation, and reporting clarity.

    Want to explore how to tailor Dynamics 365 Finance and Operations to your unique business needs?
    Get in touch with our experts at marketing@confiz.com.

    ]]>
    Modernizing version control for D365: A step-by-step guide to Git configuration & migration https://www.confiz.com/blog/modernizing-version-control-for-d365-a-step-by-step-guide-to-git-configuration-migration/ Thu, 19 Jun 2025 11:40:29 +0000 https://www.confiz.com/?p=9295 As organizations modernize their DevOps practices, shifting from centralized version control systems like TFVC to distributed systems like Git has become a strategic priority, especially for teams working on Microsoft Dynamics 365 Finance & Operations (F&O). Git not only enables better collaboration and faster delivery cycles but also integrates seamlessly with Azure DevOps Git, making it the preferred choice for development teams.

    In this blog, we’ll walk you through everything you need to know to start with Git for Dynamics 365 F&O projects. You’ll learn how to configure Git, use key commands in Visual Studio, adopt best practices for commits and pull requests, and execute a smooth migration from TFVC to Git. Whether optimizing your version control practices or planning a full migration, this guide offers actionable steps to help you modernize your DevOps approach confidently.

    Why do we use Repos in DevOps for D365 F&O development?

    In DevOps for Dynamics 365 Finance and Operations (F&O) development, repos (repositories) are used primarily for version control. This allows developers to track changes to their code over time, collaborate effectively with other team members, and manage different codebase versions. It also ensures smooth development and deployment processes by facilitating branching, merging, and code reviews, all within a centralized platform like Azure DevOps Repos.

    Key benefits of using Repos in D365 F&O development

    • Collaboration

    Multiple developers can work on the same codebase simultaneously, each making changes on their own branch and merging them into the main codebase to avoid conflicts.

    • Version control

    Track every change made to the code, allowing you to easily revert to previous versions if needed.

    • Code review process

    Enable team members to review each other’s code changes before merging them into the main branch, improving code quality.

    • Branching strategies

    Create different branches for development, testing, and production environments. This allows for isolated testing and deployment of new features without affecting the live system.

    • Automated build and deployment

    Integrate with CI/CD pipelines in Azure DevOps to automatically build, test, and deploy code changes to different environments based on commits to the repo.

    • Git as the preferred version control system

    Most modern D365 F&O development teams use Git as the underlying version control system within their repos, due to its flexibility and popularity.

    • Branching strategy

    Choosing a suitable branching strategy (like feature branching, trunk-based development) is crucial for efficient collaboration and managing different code versions.

    Importance of Git

    Git is a modern, distributed version control system that is critical in today’s software development practices. It allows developers to work independently with a complete copy of the codebase, enabling faster performance and offline access. Git simplifies collaboration by supporting efficient branching and merging, which helps teams develop features, fix bugs, and experiment safely without affecting the main codebase.

    It also provides detailed history tracking, making it easy to audit changes and roll back if necessary. Widely integrated with DevOps tools and CI/CD pipelines, Git ensures smooth automation of build, test, and deployment processes. Its open-source nature and strong community support have made Git the industry standard for version control.

    Use of Git fetch, pull, push, and sync for version control in Visual Studio

    • Fetch

    It is important to fetch and pull before you push. Fetching checks for any remote commits you should incorporate into your local changes. If you see any, pull first to prevent any upstream merge conflicts. When you fetch a branch, the Git Changes window has an indicator under the branch drop-down, which displays the number of unpulled commits from the remote branch. This indicator also shows you the number of unpushed local commits.

    • Pull

    Always pull before you push. When you pull first, you can prevent upstream merge conflicts.

    • Push

    When you create commits, you have inherently saved local code snapshots. Use Push to push the commits to GitHub, where you can store them as backups or share your code with others.

    • Sync

    You can’t push commits if your local branch is behind the remote branch. If you try to push, a dialog prompts you to pull before pushing.

    Use this operation to both pull and then push, sequentially.
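
For readers who prefer the command line, these four Visual Studio buttons map to plain git commands. The sandbox below is a self-contained illustration: a local bare repository stands in for the Azure DevOps remote, and all names and paths are throwaway placeholders.

```shell
# Local sandbox for Fetch / Pull / Push / Sync (bare repo stands in for Azure DevOps)
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/origin.git"               # pretend remote
git clone -q "$tmp/origin.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
git config user.email "dev@example.com"
git config user.name "Dev"
branch=$(git symbolic-ref --short HEAD)            # default branch name

echo "change" > file.txt
git add file.txt
git commit -qm "local commit"    # a local snapshot, not yet shared

git fetch origin                 # Fetch: check for remote commits to incorporate (none yet)
git push -q -u origin "$branch"  # Push: publish local commits to the remote
git pull -q origin "$branch"     # Pull: incorporate remote commits (already up to date)
git log --oneline                # the commit now exists locally and on the remote
```

Sync, as described above, is simply the pull followed by the push, run sequentially.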

    Azure DevOps Git vs TFVC

    Key points | TFVC | Git
    Architecture | Centralized Version Control System (CVCS) | Distributed Version Control System (DVCS)
    Repository structure | A single centralized repository | Each developer has a full copy of the repository
    Branching model | Heavyweight; branching is less frequent | Lightweight; branching is fast and encouraged
    Offline work | Limited offline capabilities | Full offline capabilities
    Performance | Slower due to network dependency | Faster due to local operations
    Merging | More complex merging process | More flexible and efficient merging
    Storage model | Tracks changes per file | Tracks changes per commit (snapshot-based)
    Collaboration | Developers check in changes to a central server | Developers commit locally and push to a shared repository
    Learning curve | Easier for beginners | Steeper learning curve
    Security & access control | Fine-grained permission control at file and folder level | Permissions are typically repository-wide
    Use case | Best for large enterprises with strict control | Ideal for open-source and distributed teams

    Comparable actions in TFVC and Git

    TFVC    Git
    Map Code ⟷ Clone 
    Check In ⟷ Commit, Push 
    Code Review ⟷ Pull Request 
    New Branch ⟷ Branch
    Get Latest ⟷ Pull

    Git review process

    Git configuration

    Let’s begin with the practical part of Git configuration and its usage. Setting up Git correctly is the first step to ensuring smooth collaboration, version control, and code management in your development workflow. In this section, we’ll walk through how to configure Git on your machine, set global settings like username and email, and understand how these configurations help manage your commits and contributions effectively.

    • Create a new repository.
    • Select Git as your repository type and select the check box “Add a README.”
    • After creating the repo, click the clone button to clone the repo locally.
    • Select Visual Studio from the IDE option.
    • Copy the clone URL of the repository.
    • Click on the clone repository option from Visual Studio.
    • Paste the clone URL in the repository location section and click on the clone button.
    • Cloning the repository from the server to your local machine.
    • Click on the Git repository button under the Team Explorer tab. Right-click on the main branch and click on New local branch.
    • Enter the name of the local branch and click “create.” Your local branch will then be created and shown under the branches section.
    • Create a new model.
    • Copy the newly created model.
    • Copy and paste the new model into the metadata folder of the local repos.
    • Open a command prompt and write a make link command to make the link.
    • The link of the model has now been created successfully in AOSService.
    • Create a new project.
    • To commit the newly created model and project to the local branch, click on the Git changes tab. Select the local branch, add the model and project files in the staged changes section, and commit the initial commit. The initial commit has been checked into the local branch.
    • To see the history of the local branch, select the local branch and view the history tab.
    • After committing an initial check-in, right-click on the local branch and then click on push. This action publishes your local commits to the remote repository on Azure DevOps.
    • You can see the newly created local branch on Azure.
    • To create a pull request for the initial commit, right-click on the local branch and then click on “create pull request.” A pull request is used to merge the code from one branch to another branch.
    • Verify the artifacts that need to be added to a pull request, and after giving the pull request a name, click on the create button.
    • To view the pull request on Azure, select the pull request tab.
    • After verifying the elements, click on pull request and then the complete button to merge the initial commit from the local branch to the main branch or default branch.
    • This dialogue box will open after clicking on the complete button of the previous image. In this dialogue box, select merge as the merge type and select the “complete associated work items after merging” option. And click on the complete merge button.
    • Verify the results by viewing the main branch.
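
The same walkthrough can be sketched with plain git commands. This is a simplified, self-contained illustration: the repository, branch, and model names are placeholders, a local bare repository stands in for Azure DevOps, and the mklink step (run on the Windows D365 dev VM) is shown only as a comment. The pull request steps still happen in the Azure DevOps UI.

```shell
# Command-line equivalent of the Visual Studio walkthrough (all names are placeholders)
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/D365Repo.git"             # stand-in for the Azure DevOps repo
git clone -q "$tmp/D365Repo.git" "$tmp/D365Repo" 2>/dev/null
cd "$tmp/D365Repo"
git config user.email "dev@example.com"
git config user.name "Dev"

git checkout -qb feature/initial-model             # create the new local branch
mkdir -p Metadata/MyModel                          # paste the new model into the metadata folder
echo "<Descriptor/>" > Metadata/MyModel/Descriptor.xml
# mklink /D "K:\AOSService\PackagesLocalDirectory\MyModel" "C:\Repos\D365Repo\Metadata\MyModel"

git add .
git commit -qm "Initial commit"                    # commit the model and project files
git push -q -u origin feature/initial-model        # push; the branch is now on the remote
git ls-remote --heads origin                       # lists refs/heads/feature/initial-model
```

From here, creating and completing the pull request merges feature/initial-model into the main branch, as described in the steps above.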

    Work with Git

    After the initial commit, the configuration part of Git has been completed. Now, let’s make the first commit for an actual task, taking a runnable class as an example.

    • Create a class and then go to the Git changes. In the staged changes section, take a class, and after giving the name of the commit, click on commit staged.
    • Right-click on the local branch and click on push.
    • Right-click on the server local branch, select create pull request, and then create the pull request.
    • After creating a pull request, click Complete the pull request.
    • Click on complete merge.
    • To verify the result, go to the main branch and find the elements of the recent pull request.

    Migrate repositories from TFVC to Git

    You can migrate code from an existing TFVC repository to a new Git repository within the same organization.

    Why migrate from TFVC to Git?

    Over the past several years, Microsoft has added no new features to Team Foundation Version Control (TFVC); Git is the preferred version control system in Azure DevOps Repos. All recent improvements in security, performance, and accessibility have been made only to Git repositories, and TFVC usage has declined steadily. Microsoft does not plan to remove the feature set for the customers who still use TFVC, but it does plan to phase out TFVC in new projects and in organizations or projects that do not currently use it.

    As a step in that direction, Microsoft is introducing a setting to “Disable creation of TFVC repositories.” It only affects the creation of new TFVC repositories; existing repositories are not impacted. The setting is available at the organization and project levels, with the organization-level setting taking priority over the project level. By default, the toggle is enabled, which means you can no longer create TFVC repositories in new projects. If you have a strong reason, your administrator can turn the toggle off, though Microsoft plans to remove that option in the future.

    Prerequisites of the migration

    Before beginning the method of importing the repository, numerous prerequisites need to be met to facilitate migrations:

    • Only a single branch is migrated. When planning the migration, adopt a new branching strategy for Git.
    • Only the latest version of the source is imported by default. You can optionally migrate some history, up to 180 days.
    • Your repository should not contain binary assets like images and scientific data. These assets should use the Git LFS (Large File Storage) extension, which the import tool does not configure.
    • The imported repository cannot exceed 1 GB in size.

    Migrate repositories from TFVC to Git

    The process to migrate from TFVC is simple:

    • Create a new repository where you’d like to store the code for migration.
    • Select Git as your repository type and unmark the “add a README” option for the migration.
    • Click on the Import button under the Import a repository section.
    • Select the type TFVC and give the repository path with the branch. ($/Repository/Branch). If you also want to migrate history, mark the migrate history check box ‘yes’ and enter the history up to 180 days.
    • The migration procedure will begin after you click the Import button.
    • The migration will take a while to finish; once it completes successfully, the outcome is shown.
    • Click on the Contents button to view the items in the migrated repository, or the History button to view its history.

    Conclusion

    Successfully transitioning from TFVC to Git within Microsoft Dynamics 365 F&O environments is a foundational step toward building a more agile and future-ready DevOps setup. By adopting Git, development teams gain better control over versioning, enable seamless collaboration, and accelerate delivery cycles through modern branching strategies and CI/CD integration.

    At Confiz, we’ve helped organizations streamline their version control practices by aligning the right tools with proven methodologies. Our experience shows that careful planning, hands-on configuration, and team enablement are key to unlocking Git’s full potential in enterprise environments.

    If you’re looking to modernize your DevOps workflows or need expert guidance with Git implementation, contact us at marketing@confiz.com.

    ]]>
    Green AI: Rethinking Generative AI for a sustainable future https://www.confiz.com/blog/green-ai-rethinking-generative-ai-for-a-sustainable-future/ Wed, 18 Jun 2025 11:31:01 +0000 https://www.confiz.com/?p=9292 The climate crisis can feel like a moving target. Just as one goal is set, it’s missed and replaced by another. Add pollution, deforestation, and the rapid loss of biodiversity to all of this, and it’s easy to feel uncertain about the planet’s future.

    Amid these mounting environmental challenges, technology is playing a critical role in shaping our response. One emerging innovation – Generative AI – is showing promise in sustainability efforts. By analyzing complex environmental data at scale, it can surface insights and patterns that help address some of our most persistent climate issues.

    More insights: What is generative AI: A new era in intelligent automation

    For business leaders, the intersection of Gen AI and sustainability marks a turning point. As Gen AI adoption scales, so does Generative AI’s environmental impact—each interaction with OpenAI’s chatbot uses 2.9 watt-hours of electricity, and AI-related energy use could reach 3% of global consumption by 2030. Balancing these impacts should be central to your green Gen AI strategy and business case discussions.
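
To make the scale of that 2.9 watt-hours figure concrete, here is a rough back-of-envelope. The daily query volume below (100 million) is a hypothetical assumption purely for illustration, not a reported number.

```shell
# Back-of-envelope: daily energy for a hypothetical query volume at 2.9 Wh per query
wh_per_query=2.9
queries_per_day=100000000   # hypothetical assumption, for illustration only
mwh_per_day=$(awk -v w="$wh_per_query" -v q="$queries_per_day" \
  'BEGIN { printf "%.0f", w * q / 1e6 }')   # convert Wh to MWh
echo "${mwh_per_day} MWh/day"               # 290 MWh/day under these assumptions
```

Under these assumptions, a single chatbot service would draw hundreds of megawatt-hours per day, which is why sustainability belongs in the adoption conversation from the start.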

    This blog explores the lesser-known side of Generative AI, its environmental impact, and why sustainability must be part of your Gen AI adoption strategy.

    We will cover:

    • What is the environmental impact of Generative AI and the energy it consumes?
    • How to make Generative AI green (more sustainable)?
    • Why must business leaders take actionable steps to embed sustainability into their AI strategies?

    The hidden cost of Gen AI: Exploring its environmental impact and carbon footprint

    Before we discuss the role of AI in sustainability, let’s take a closer look at its darker side: What is the environmental impact of Generative AI?

    While Generative AI holds immense promise for innovation and efficiency, its growing adoption comes with a significant yet often overlooked environmental cost. From powering large language models to running millions of real-time interactions, the energy demands of Gen AI are substantial. Each prompt, response, and model training session consumes electricity, contributing to carbon emissions and data center strain. As businesses scale their AI strategies, it’s crucial to understand the environmental carbon footprint of generative AI and consider how to align AI adoption with broader Gen AI and sustainability goals. Addressing this challenge is not just a technical necessity; it’s a strategic responsibility.

    Despite valid concerns, the full picture is more nuanced.

    While its energy consumption is real and rising, Generative AI shouldn’t be viewed solely as a burden on sustainability. When used strategically, it can actually support environmental goals by reducing waste, optimizing resource use, and enabling smarter decisions across operations, supply chains, and product design. In short, its impact depends on how we use it.

    The sustainability-profits paradox, and how AI can solve it

    For years, business leaders have struggled to turn sustainability goals into action. Financial pressures often force difficult tradeoffs, pushing environmental initiatives down the priority list. Even when a strategy exists, progress stalls when sustainability is treated as separate from core business objectives.

    One major challenge has been integrating AI sustainability into the business model in a way that supports growth. Many leaders see sustainability as a revenue enabler but still feel stuck choosing between doing what’s right for the planet and meeting performance targets.

    That’s starting to change. Generative AI is helping companies see these goals not as conflicting, but as complementary. With the ability to rapidly analyze complex data and generate insights, AI can support decisions that drive environmental and financial outcomes, reassuring business leaders about the economic benefits of sustainability.

    Read more: Learn about the ethics of generative AI and how to use it responsibly

    For example, generative AI can help companies:

    • Predict demand more accurately using historical sales data and market trends
    • Optimize production levels to reduce waste and avoid overstock
    • Align operations more closely with sustainability targets without hurting margins

    However, generative AI can’t drive this change alone. It works best when combined with traditional AI, IoT, and other emerging technologies and when supported by the right foundations:

    • High-quality, trusted data
    • Integrated systems and workflows
    • Skilled teams with decision-making capabilities aligned to sustainability goals

    Organizations that mature across these areas are more likely to see their AI sustainability efforts translate into real business performance. And while interest in generative AI is growing, its success ultimately depends on the quality and transparency of the data it’s built on.

    Without strong data, even the best AI can’t deliver measurable results. But with the right foundation, generative AI can help eliminate the tradeoff and turn sustainability into a driver of long-term value.

    How to operationalize Generative AI for sustainability?

    Operationalizing generative AI for sustainability starts with intentional design and governance. Organizations must ensure AI models are trained and deployed using energy-efficient infrastructure while aligning use cases with clear environmental goals, such as reducing waste, optimizing resource consumption, or improving supply chain efficiency.

    It’s also essential to establish metrics to track AI initiatives’ environmental costs and sustainability gains. When paired with responsible data practices and cross-functional collaboration, generative AI can become a practical tool for driving measurable climate impact.

    But how can we consider and lower the sustainable impact of generative AI? Here are the top ways to operationalize generative AI for sustainability:

    1. Democratize insights across teams

    Make sustainability data and insights available across teams, ecosystems, and the enterprise, and build an understanding of where particular generative AI use cases pose risks or add value. Use green Gen AI to find patterns that inform better pricing, budgeting, and incentive mechanisms based on sustainability data and metrics.

    2. Embed sustainability across the enterprise

    Business leaders should align business strategy, AI strategy, and sustainability rather than modernizing generative AI in isolation. Integrate sustainability-driven initiatives, that is, projects or actions designed to promote environmental and social sustainability, into corporate governance frameworks and business units. Use Gen AI for sustainability to augment and improve your data for reporting on and operationalizing sustainability targets.

    3. Innovate, don’t just automate

    To transform how things get done, use generative AI as a source of innovation for sustainability rather than simply automating suboptimal, existing ways of working.

    Importance of collaboration while building a sustainable ecosystem for Gen AI

    Generative AI sustainability is a team effort; no organization can do it alone. Natural resources cross borders and industries; protecting them requires a shared commitment. Corporations and governments have a role in safeguarding the planet for future generations.

    Building a truly sustainable ecosystem means bringing together a wide range of expertise. That includes AI specialists, data scientists, environmental experts, policymakers, and business leaders working toward common goals.

    The importance of collaboration isn’t new. What’s changed is how generative AI is enabling it. With faster data analysis, shared insights, and co-created solutions, AI is helping organizations connect and innovate more effectively across the ecosystem.

    For example, generative AI can help:

    • Manufacturers, scientists, and consumer brands co-develop sustainable packaging
    • Partners evaluate material choices based on the environmental impact of generative AI and product performance
    • Teams accelerate design and testing using AI-generated options aligned with green criteria

    Generative AI empowers ecosystems to make smarter, more sustainable decisions by supporting real-time, data-driven collaboration. Many organizations are already building this into their strategy, co-developing AI-driven sustainability solutions with partners and suppliers to scale impact.

    Read more: What’s the difference between Generative AI and LLMs?

    Enabling sustainability through strategic partnerships: A leadership perspective

    You can scale value across the enterprise and use your ecosystem to reduce the carbon footprint of generative AI while advancing additional business goals. Business leaders can also co-create Gen AI and sustainability initiatives with partners, control environmental impact, and modernize sustainability programs.

    You need to focus on these key aspects to control generative AI’s carbon footprint through strategic partnerships:

    1. Make sustainability a team effort

    Collaboration with ecosystem partners is essential to achieving meaningful AI sustainability outcomes. When partners are actively involved in your generative AI and data initiatives, it opens the door to shared insights, faster innovation, and more scalable solutions. Sharing data enables deeper collaboration and co-creation of effective, scalable solutions.

    2. Empower employees with accessible insights

    Giving employees access to relevant sustainability data and AI tools is key to implementing the strategy. When business leaders enable this access, people can make informed decisions in their daily work. Even small adjustments add up to a significant impact: a company’s goals for controlling its generative AI carbon footprint come to life through thousands, sometimes millions, of daily actions.

    3. Build the right skill sets

    Building a sustainable business starts with investing in people. Prioritize developing teams with the right blend of sustainability knowledge and generative AI skills. At the same time, generative AI for sustainability should be used to educate employees on key sustainability concepts, helping to embed awareness and action across the organization.

    Generative AI for sustainability: Actionable strategies for a greener future

    As the urgency around climate action grows, businesses are looking to technology for smarter, more sustainable solutions. When used thoughtfully, this technology offers powerful opportunities to reduce the environmental impact of generative AI while driving innovation.

    So, “How to make generative AI green?” Here are some practical strategies to align AI adoption with sustainability goals:

    1. Build smarter, not bigger

    As generative AI becomes more embedded in business operations, now is the time to recalibrate your AI for a sustainability approach. Focus on building sustainable AI practices before expansion makes change more difficult. Prioritize upgrading and fine-tuning existing models instead of training new ones from scratch, and adopt lower energy-intensive computing methods wherever possible.

    2. Rethink IT infrastructure for sustainability

    Sustainable IT design is just as important. Monitor energy use, hardware efficiency, and data storage to uncover opportunities for greater energy savings. When applied strategically, technologies like hybrid cloud can significantly reduce the carbon footprint of your generative AI workloads.

    3. Stay intentional with data governance

    Most importantly, stay intentional. Avoid shortcuts that compromise long-term goals. Implement strong data governance to ensure that your use of generative AI aligns with your organization’s sustainability objectives and values.

    4. Shift to renewable-powered data centers

    An effective way to make generative AI greener is to move AI processing to data centers powered by renewable energy, lowering emissions at the source. Major cloud providers like Microsoft and Amazon have pledged to transition to 100% renewable energy, an important development as AI’s computational demands rise.
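    A practical complement to choosing renewable-powered regions is carbon-aware scheduling: deferring flexible batch workloads (such as model fine-tuning or nightly inference jobs) to the hours when the local grid is cleanest. A minimal sketch, assuming you can obtain an hourly grid-carbon-intensity forecast; the values below are invented for illustration, and real ones would come from a grid-data provider’s API.

```python
# Hypothetical hourly forecast of grid carbon intensity (gCO2e per kWh).
forecast = {
    "00:00": 420, "04:00": 380, "08:00": 310,
    "12:00": 180,  # midday solar peak
    "16:00": 250, "20:00": 400,
}

def greenest_slot(forecast: dict[str, float]) -> str:
    """Pick the hour with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

slot = greenest_slot(forecast)
print(f"Schedule the batch job at {slot} ({forecast[slot]} gCO2e/kWh)")
```

    The same selection logic extends naturally to choosing between cloud regions rather than between hours.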

    5. Incorporate sustainability into AI governance

    Embed environmental impact assessments into your AI strategy, procurement decisions, and risk frameworks.

    6. Consolidate and clean data pipelines

    Efficient data handling reduces compute demand. Eliminate redundancy and ensure high-quality data to avoid unnecessary model complexity.
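    As a small illustration of removing redundancy, the sketch below deduplicates records by a key and drops incomplete rows before they ever reach a model; the record fields are hypothetical.

```python
# Hypothetical raw records merged from two overlapping data sources.
records = [
    {"id": 1, "reading": 21.5},
    {"id": 2, "reading": 19.8},
    {"id": 1, "reading": 21.5},   # duplicate of the first row
    {"id": 3, "reading": None},   # incomplete row
]

def clean(rows):
    """Keep the first occurrence of each id and drop rows with missing values."""
    seen, out = set(), []
    for row in rows:
        if row["id"] in seen or row["reading"] is None:
            continue
        seen.add(row["id"])
        out.append(row)
    return out

cleaned = clean(records)
print(f"{len(records)} raw rows -> {len(cleaned)} clean rows")
```

    Every duplicate or unusable row filtered out here is a row that never consumes compute downstream.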

    7. Track AI’s environmental footprint

    Establish KPIs to monitor carbon emissions, compute usage, and energy consumption tied to generative AI applications.
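    One such KPI can be estimated directly from measured energy draw: emissions ≈ energy consumed × grid carbon intensity. The figures below (accelerator power, runtime, grid intensity) are illustrative assumptions, not benchmarks.

```python
def inference_co2e_grams(power_watts: float, hours: float,
                         grid_gco2e_per_kwh: float) -> float:
    """Estimate CO2e emissions from a workload's energy consumption."""
    energy_kwh = power_watts * hours / 1000  # watts -> kilowatt-hours
    return energy_kwh * grid_gco2e_per_kwh

# Illustrative: a 300 W accelerator serving inference for 24 hours
# on a grid averaging 400 gCO2e/kWh.
grams = inference_co2e_grams(power_watts=300, hours=24,
                             grid_gco2e_per_kwh=400)
print(f"Estimated emissions: {grams / 1000:.2f} kg CO2e/day")
```

    Logging this figure per workload over time is enough to turn a vague sustainability goal into a trackable KPI.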

    8. Blend Gen AI with IoT and traditional AI

    Combine generative AI with other technologies like IoT sensors or predictive analytics for greater efficiency and real-time environmental insights.
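    A lightweight example of this pairing: use simple traditional analytics to flag IoT sensor readings that drift beyond a rolling baseline, so a heavier generative model is invoked only when something actually needs explaining. The thresholds and readings below are invented for illustration.

```python
from statistics import mean, stdev

def anomalies(readings, window=5, z=2.0):
    """Flag readings more than z standard deviations from the rolling mean."""
    flagged = []
    for i in range(window, len(readings)):
        base = readings[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma and abs(readings[i] - mu) > z * sigma:
            flagged.append(i)
    return flagged

# Hypothetical energy-meter readings (kW); one spike near the end.
meter = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0, 5.1, 9.7, 5.0]
print("Anomalous indices:", anomalies(meter))
```

    Cheap pre-filtering like this keeps real-time environmental monitoring continuous while reserving energy-hungry generative inference for the rare events that warrant it.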

    9. Raise awareness and build accountability

    Educate teams, stakeholders, and leadership on the environmental impact of AI use. Create internal accountability mechanisms to ensure AI initiatives are aligned with sustainability goals and are transparently reported.

    Conclusion

    Generative AI carries environmental risks, but it also has significant potential to drive sustainable innovation, provided it is approached with intention, responsibility, and collaboration.

    The path forward requires balance. Business leaders must integrate sustainability into every aspect of their AI strategies – from how models are built and deployed to how insights are shared and acted upon. Organizations can begin aligning generative AI adoption with long-term sustainability goals by empowering employees, engaging partners, and investing in energy-efficient technologies. In short, the goal isn’t just to use generative AI, it’s to use it wisely.

    If you want to align innovation with impact, now is the time to explore how generative AI can support your sustainability goals. At Confiz, we offer a proof-of-concept (POC) approach to help businesses build, test, and scale generative AI solutions responsibly, ensuring that sustainability is embedded from the start.

    Ready to begin your responsible AI journey? Contact us at marketing@confiz.com to kick-start a tailored POC that aligns with your business goals and environmental priorities.

    ]]>