Business Intelligence Info

Monitoring the Power Platform: Dataverse – Continuous Monitoring with Availability Tests

April 9, 2021   Microsoft Dynamics CRM

Summary

Continuous monitoring gives enterprises the opportunity to ensure that reliable, performant services are immediately available to their users. Platforms need to keep up with demand in a way that is seamless to users; systems that become unreliable or unusable are quickly disregarded or abandoned. One surefire way to ensure users won't use a service is for the service to be unavailable. To account for this, enterprises look to service level agreements and business continuity strategies. Part of this strategy includes testing for availability.

Azure Application Insights provides features that allow organizations to quickly report and act on current and trending availability metrics. This article reviews the various tests that can be configured within the service. From there, we will look at how the test data is collected for analysis and walk through a use case involving monitoring the Dataverse API. Finally, we wrap up by implementing a monitoring strategy to assist with notifications and automation.

Azure Application Insights Availability Tests

Azure Application Insights availability tests come in three distinct groupings. The first reaches out to a URL from different points around the world. The second allows for the replay of a recorded user interaction with a site or web-based service. Both of these originate from within the Azure Application Insights feature itself, created in the Azure Portal or through the Azure APIs.

The final type of test is a completely custom test. A custom test allows flexibility in how, what, and where we test. Because of these attributes, this type of test is ideal and will serve as the test strategy implemented below.

Important note on web tests:

The web test mechanism has been marked as deprecated, an announcement that, as expected, has drawn varied feedback. With this in mind, I recommend avoiding web tests; if web tests are currently in use, look to migrate to custom tests.

Building Ping Tests

URL ping tests in Azure Application Insights make a request to a user-specified URL. As documented, this test doesn't actually use ICMP; it sends an HTTP request, allowing response headers, duration timings, and other details to be captured.

For the Power Platform, this can be useful for testing the availability of Power Apps Portals or services utilizing Power Virtual Agents or custom connectors. When configuring the test, conditions can be set on the request, including whether to fetch dependent requests (such as images needed for the webpage) and whether to retry after an initial failure.

The test frequency can be set to run every five, ten, or fifteen minutes from various Azure data centers across the globe. It's recommended to test from no fewer than five locations, which will help diagnose network and latency issues.

Finally, the referenced documentation recommends that, for working with alerts, the optimal configuration is to set the number of test locations equal to the alert threshold plus two.
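That sizing rule can be made concrete with a trivial helper (the function name is mine, not from the article or the Azure docs):

```python
def recommended_locations(alert_threshold: int) -> int:
    """Minimum number of test locations for a given alert failure threshold,
    per the "threshold + 2" guidance referenced above."""
    return alert_threshold + 2
```

So an alert configured to fire when three locations report failures should run the test from at least five locations.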

Building Custom Tests

Continuing down the path of availability tests, the need to expand beyond URL ping tests will eventually arise. Situations such as validating not only uptime but also authentication, latency, and service health can all benefit from custom availability tests.

Before building custom tests, let's take a closer look at the availability table within Azure Application Insights.

The Availability Table

The availability table is where our test telemetry will reside, whether from URL ping tests or custom tests. The table is designed to capture whether a test was successful, its duration (typically measured in milliseconds with a stopwatch approach), the location of the test, and other properties. I'll review this in depth later in the article, but for now keep in mind that at a minimum we want to capture success, timing, and location for each test.

Creating and Deploying an Azure Function(s)

Azure Functions offer an ideal service for hosting our availability tests. Deployable worldwide and resilient, they let us quickly modify and publish changes to our tests with minimal effort. Azure Functions also offer a real advantage: the ability to use a timer-based trigger similar to a CRON job (it actually uses NCRONTAB expressions) or an HTTP trigger allowing for ad hoc test runs.


Triggers

There are two public entry points, one based on the HttpTrigger and one based on the TimerTrigger. The HttpTrigger is relatively straightforward, allowing GET and POST messages. The advantage of this type of trigger is flexibility: function requests can be sent from practically any pillar of the Power Platform, such as Power Apps, Power Automate flows, or Power Virtual Agents.

The TimerTrigger, on the other hand, is, as expected, set to run on a predefined interval using the CRON (NCRONTAB) expression schema. What I like about this approach is that we use a well-known interface for scheduling tasks while utilizing the power of Azure Functions. The schedule can be hard coded, but it can also be made configurable using settings contained within the Azure Function or elsewhere.
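As a rough sketch of that shape (the article's own code is shown only in images, so the binding decorators and helper names below are my assumptions, not the author's implementation), the test core can be a plain function shared by both entry points:

```python
import time

def run_availability_test() -> dict:
    """Pure test core shared by both entry points.
    The actual Dataverse availability request is stubbed out here."""
    start = time.perf_counter()
    # ... perform the Dataverse availability request here ...
    duration_ms = (time.perf_counter() - start) * 1000
    return {"success": True, "durationMs": duration_ms}

# Timer entry point, every five minutes via an NCRONTAB expression
# (binding syntax assumed from the Azure Functions Python programming model):
# @app.timer_trigger(schedule="0 */5 * * * *", arg_name="timer")
# def availability_timer(timer) -> None:
#     publish(run_availability_test())   # publish() is a hypothetical sink
#
# HTTP entry point for ad hoc runs:
# @app.route(route="availability", methods=["GET", "POST"])
# def availability_http(req):
#     return json.dumps(run_availability_test())
```

Keeping the test logic out of the trigger functions means the same code serves scheduled runs and ad hoc HTTP-initiated runs.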

The approach laid out here provides another major advantage: it is decoupled from the Power Platform and will not be impacted or hindered by the very platform we are testing for availability!

Dataverse Requests

The decision on how best to connect to and report on Dataverse availability is entirely up to the business requirements that need to be met. What I recommend is to authenticate and perform a command confirming not only availability but also authorization, for example requesting an OAuth token and performing two requests, the WhoAmI and RetrieveCurrentOrganization actions.


This helps confirm that the service principal used is valid and can return responses from the Dataverse. In my example, I had two main requirements for working with the Dataverse: connect using a service principal and avoid any SDK dependency. The test must minimize potential blockers such as license modifications or assembly version lock-in. Again, how you implement this is completely up to you.
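A standard-library sketch of the token and WhoAmI requests described above is shown below. The endpoint shapes follow the public Azure AD client-credentials flow and the Dataverse Web API; all of the parameter values (tenant_id, org_url, and so on) are placeholders, and this is one way to satisfy the "no SDK dependency" requirement, not the article's exact code:

```python
import urllib.parse
import urllib.request

def build_token_request(tenant_id: str, client_id: str, client_secret: str,
                        org_url: str) -> urllib.request.Request:
    """Client-credentials token request for the Dataverse organization."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{org_url}/.default",  # service-principal scope for the org
    }).encode()
    return urllib.request.Request(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data=body, method="POST")

def build_whoami_request(org_url: str, access_token: str) -> urllib.request.Request:
    """WhoAmI call used to confirm both availability and authorization."""
    return urllib.request.Request(
        f"{org_url}/api/data/v9.2/WhoAmI",
        headers={"Authorization": f"Bearer {access_token}",
                 "Accept": "application/json"})

# At runtime (sketch): parse the access_token from the token response with
# json.load(urllib.request.urlopen(...)), then urlopen(build_whoami_request(...)).
```

A successful WhoAmI response returns the caller's user, business unit, and organization identifiers, which is enough to prove both uptime and authorization.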

Location, Location, Location…

Once it's been determined what needs to be tested and how to go about testing it, the decisions on where to test and how to collect metrics still need answers. Again, Azure Functions come to the rescue, allowing developers (DevOps engineers, cover your ears) to quickly deploy from Visual Studio to locations around the world. That said, ideally proper CI/CD processes are followed.


Latency across regions and continents is natively measured by Microsoft using ThousandEyes. The Region to Region Latency tool, documented here, is a good reference for the average latency between Azure data centers when performing actions across the Azure network backbone. Alternatively, you can collect latency and bandwidth information using Mark Russinovich's popular tool, PsPing.

Building the Availability Telemetry

Once the application we want to test for availability has been identified and the endpoint connected to, we need to create the availability telemetry message. As discussed above, the AvailabilityResults table within Azure Application Insights contains columns that can be used to track the success of API actions and the locations from which they originated.

For the Dataverse, or in fact any HTTP-invoked request, we can also capture headers from the response. This can provide insight into current API usage as it pertains to limits (e.g., entitlement limits) or correlation identifiers. These headers work well in the custom dimensions column, a JSON-serialized column providing the flexibility needed to add additional data points.

That said, at a minimum, what I have found most useful for the Azure Application Insights tooling is first coming up with a name for your tests. Once named, setting the run location property is key to grouping the tests regionally; within Azure Functions, an environment variable called "REGION_NAME" provides the data center location. Finally, setting the success property along with the message is needed to ensure we track uptime.

Optionally, duration can be set; your requirements will dictate which call timings are captured. In my example I am executing a simple action call and wrapping it in a timer. Taking this duration and comparing it to the latest region-to-region latency should provide ample timing metrics.
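One way to assemble the telemetry record described in the last few paragraphs is sketched below. The field names mirror the availability columns discussed above (name, success, message, duration, run location, custom dimensions), but the exact envelope your telemetry SDK emits may differ, and the helper name is mine:

```python
import json
import os
import time

def run_timed_test(test_name, call):
    """Run a single availability check and return an availability-style record.
    `call` performs the request and returns the response headers on success."""
    start = time.perf_counter()
    try:
        response_headers = call()          # e.g. the WhoAmI request
        success, message = True, "Passed"
    except Exception as exc:               # record failures as telemetry, not crashes
        response_headers, success, message = {}, False, str(exc)
    duration_ms = (time.perf_counter() - start) * 1000
    return {
        "name": test_name,
        "success": success,
        "message": message,
        "duration": round(duration_ms, 2),   # stopwatch-style timing in ms
        # Azure Functions exposes its data center via the REGION_NAME variable
        "runLocation": os.environ.get("REGION_NAME", "local"),
        # response headers (API limits, correlation ids) fit custom dimensions
        "customDimensions": json.dumps(dict(response_headers)),
    }
```

Wrapping the call in a try/except ensures an unavailable endpoint produces a failed availability record rather than a failed function execution.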

Reviewing Tests

The messages from our availability tests reside within the AvailabilityResults table within Azure Application Insights. The messages are summarized and can be drilled into visually using the Availability feature.


In the Availability feature, the tests are grouped by name, in this case "Dynamics 365 Availability Test". Expanding the test, we can see the various regions. Once a region is selected, we can drill into the scatter plot to see how uptime may have been impacted, adding filters to expand the time window searched as well as filter by organization version.


Using the technique described above, we can now see not only when an organization version changed but also the beginnings of availability and duration timings.

Custom Availability Tests with the Dataverse API Demo

Below is a link to my YouTube video, including a detailed analysis and demo of working with availability tests within Azure Application Insights. The demo includes setting up the test, deploying to Azure, reviewing logs, and creating Azure Monitor alert rules.

Sample Code

Sample code can be found here.

Next Steps

In this article we reviewed how to use availability tests within Azure Application Insights. We explored creating URL ping tests as well as building custom tests. From there, we designed and published an Azure Function globally to test Dataverse availability.

As shown, availability tests not only tell us the uptime of an API or service but are flexible enough to capture data points such as build versions, flow runs, etc. Consider how to extend this type of testing to further utilize Azure Monitor capabilities such as alerting.

If you are interested in learning more about specialized guidance and training for monitoring or other areas of the Power Platform, which includes a monitoring workshop, please contact your Technical Account Manager or Microsoft representative for further details.

Your feedback is extremely valuable so please leave a comment below and I’ll be happy to help where I can! Also, if you find any inconsistencies, omissions or have suggestions, please go here to submit a new issue.

Index

Monitoring the Power Platform: Introduction and Index


Dynamics 365 Customer Engagement in the Field


Snorkel AI’s app development platform lures $35M

April 7, 2021   Big Data


Snorkel AI, a startup developing data labeling tools aimed at enterprises, today announced that it raised $35 million in a series B round led by Lightspeed Venture Partners. The funding marks the launch of the company’s Application Studio, a visual builder with templated solutions for common AI use cases based on best practices from academic institutions.

According to a 2020 Cognilytica report, 80% of AI development time is spent on manually gathering, organizing, and labeling the data that’s used to train machine learning models. Hand labeling is notoriously expensive and slow, with limited leeway for development teams to build, iterate, adapt, or audit apps. In a survey conducted by the startup CrowdFlower, data scientists said that they spend 60% of their time just organizing and cleaning data, compared with 4% on refining algorithms.

Snorkel AI hopes to address this with tools that let customers create and manage training data, train models, and analyze and iterate AI systems. Founded by a team spun out of the Stanford AI Lab, Snorkel AI claims to offer the first AI app development platform, Snorkel Flow, that labels and manages machine learning training data programmatically.


Application Studio will expand the Snorkel AI platform’s capabilities in a number of ways, the company says, by introducing prebuilt solution templates based on industry-specific use cases. Customers can leverage templates for contract intelligence, news analytics, and customer interaction routing as well as common AI tasks such as text and document classification, named entity recognition, and information extraction. Application Studio also provides packaged app-specific preprocessors, programmatic labeling templates, and high-performance open source models that can be trained with private data, in addition to collaborative workflows that decompose apps into modular parts.

Beyond this, Application Studio offers a feature that versions the entire development pipeline from datasets to user contributions. With a few lines of code, apps can be adapted to new data or goals. And they keep training data labeling and orchestration in-house, mitigating data breach and data bias risks.

Application Studio is in preview and will be generally available later this year within Snorkel Flow, Snorkel AI says.

Palo Alto, California-based Snorkel AI’s latest fundraising round brings the startup’s total raised to date to $50 million, which 40-employee Snorkel AI says will be used to scale its engineering team and acquire new customers. Previous investors Greylock, GV, In-Q-Tel, and Nepenthe Capital, along with new investor Walden and funds and accounts managed by BlackRock, also participated in the series B.


Big Data – VentureBeat


Cere Network raises $5 million to create decentralized data cloud platform

March 29, 2021   Big Data


Cere Network has raised $5 million for its decentralized data cloud (DDC) platform, which is launching today for developers. The company’s ambition is to take on data cloud leader Snowflake.

The investment was led by Republic Labs, the investment arm of crowdsourced funding platform Republic. Other investors include Woodstock Fund, JRR Capital, Ledger Prime, G1 Ventures, ZB exchange, and Gate.io exchange. Cere Network previously raised $5 million from Binance Labs and Arrington XRP Capital, amongst others, bringing its total raised to $10 million.

“Enterprises using Snowflake are still constrained by bureaucratic data acquisition processes, complex and insufficient cloud security practices, and poor AI/ML governance,” Cere Network CEO Fred Jin said in an email to VentureBeat. “Cere’s technology allows more data agility and data interoperability across different datasets and partners, which extracts more value from the data faster compared to traditional compartmentalized setup.”

The Cere DDC platform launches to developers today, which allows thousands of data queries to be hosted on the blockchain, the transparent and secure digital ledger.

The platform offers a more secure first-party data foundation in the cloud by using blockchain identity and data encryption to onboard and segment individual consumer data. This data is then automated into highly customizable and interoperable virtual datasets, directly accessible in near real time by all business units, partners/vendors, and machine-learning processes.

Above: Cere Network’s data cloud query.

Image Credit: Cere Network

The Cere token will be used to power its decentralized data cloud and fuel Cere’s open data marketplace that allows for trustless data-sharing among businesses and external data specialists, as well as staking and governance. The public sale of the Cere token will be held on Republic, the first token sale on the platform.

“We’ve been following Cere Network for some time and have been impressed with the team and the market fit – and need – for a decentralized data cloud,” said Boris Revsin, managing director of Republic Labs, in a statement. “We’re very excited to host Cere Network’s token sale on Republic, which will ensure a decentralized network and faster adoption in the enterprise space of blockchain technology. Their DDC improves upon Snowflake using blockchain identity and data encryption to onboard and segment individual consumer data.”

Developers can access the Cere DDC here. The public sale for Cere token is scheduled for March 31 on Republic. The company said it is working with a number of Fortune 1,000 customers.

“There’s a huge amount of opportunities in this rapidly shifting space for the coming years. We don’t plan to take on the likes of Snowflake head on, yet, but rather focus on specific solutions and verticals where we can bring more customization and efficiency. We are ok with chipping away at their lead while doing this,” Jin said. “We are bringing an open data marketplace which will open up data access beyond the limitation of traditional silo’d data ecosystems, which include Snowflake, and the likes of Salesforce.”



Internet Explorer 11 support for Dynamics 365 and Microsoft Power Platform is deprecated

March 17, 2021   Microsoft Dynamics CRM

Effective December 2020, Microsoft Internet Explorer 11 support for Microsoft Dynamics 365 and Microsoft Power Platform is deprecated, and Internet Explorer 11 won’t be supported after August 2021.

This will impact customers who use Dynamics 365 and Microsoft Power Platform products that are designed to be used through an Internet Explorer 11 interface. After August 2021, Internet Explorer 11 won’t be supported for such Dynamics 365 and Microsoft Power Platform products. We recommend that customers transition to Microsoft Edge.

If you have more questions, contact your Microsoft Customer Service representative or Microsoft Partner.

Walter Carlin

Senior Customer Engineer | Dynamics 365 CE | Microsoft Corp.


Dynamics 365 and Power Platform Monthly Update-March 2021

March 14, 2021   Microsoft Dynamics CRM

Dynamics 365 and Power Platform at Microsoft Ignite – March 2-4, 2021

 

Virtual Microsoft Ignite is here. It has tons of content on the future of Microsoft Dynamics 365 and Microsoft Power Platform, their capabilities, and the roadmap.
  • Connect with experts.
  • Engage with your global community.

Access On-Demand content

Check out a curated list of sessions here.

Microsoft Business Applications Launch Event – April 6, 2021

Get an in-depth look at the new features and capabilities in the 2021 release wave 1 for Dynamics 365 and Microsoft Power Platform on April 6.

Power Apps 2021 Wave 1 Preview Release is now Available | UI features enhancements

The Power Apps Release Wave 1 for 2021 is available for you to enable in your environment ahead of the April rollout.

We have a number of new features that will modernize your end users’ model-driven app experiences.

Check here to opt your environment into the 2021 release wave 1.

Check this page on the Power Apps blog to learn about a few enhancements that will increase productivity for your end users:

Modern Search

Mobile Enhancements:

Modern Experience for End Users

Microsoft Dynamics 365 App for Windows comes with Offline Access in Public Preview

Learn how to accomplish the actions below from the instructions on this page on the Power Apps blog:

Power Apps Mobile | Introducing Power Apps mobile app’s new look | Public Preview

Note: Customer engagement apps (such as Dynamics 365 Sales apps and Dynamics 365 Customer Service apps) don’t run in Power Apps mobile. Instead, you use the Dynamics 365 for phones and tablets apps. More information: User Guide for Dynamics 365 for phones and tablets.

We announced the public preview of the new app selection and discovery experience for Power Apps mobile. With the new look, there is now a home page to access your commonly used content and a navigation bar to easily navigate through the app.

Check this page of Power Apps blog to learn more about the new experience.
  • Personalized Home Page experience

  • Favoriting and pinning apps to home screen using gesture controls

  • Find all your apps in one page

Power Apps Mobile | Run model-driven apps and canvas apps on Power Apps mobile

Note: Customer engagement apps (such as Dynamics 365 Sales apps and Dynamics 365 Customer Service apps) don’t run in Power Apps mobile. Instead, you use the Dynamics 365 for phones and tablets apps. More information: User Guide for Dynamics 365 for phones and tablets.
  • When you create an app, or someone shares an app with you—either a canvas app or model-driven app—you can run that app on iOS and Android devices by using Power Apps mobile.
  • If you’re on a Windows device, you can only run canvas apps; model-driven apps aren’t supported on Power Apps for Windows.

 

  • If you want to create an entirely separate icon on your phone that will open your desired app with one click, learn more about this functionality here and on the Microsoft docs website.

Power Apps Mobile | The new search experience for model-driven apps is now also available on Mobile

  • In November 2020, usability enhancements and core improvements to relevance search were introduced on the web.
  • These additions make information discovery fast and easy. 
  • We are now bringing this experience to mobile.  
  • These improvements include:
    • A user-friendly search experience and smart suggestions during search.
    • The ability to search from any page in the app.
    • Recent searches and recently accessed records shown on the search page.
    • Suggested results as you type.
  • Additional enhancements coming to search in March 2021:
    • A results page and the ability to filter your search results.

With Relevance Search and the 2021 release wave 1 preview enabled, you can try out all these capabilities.

Learn more on this page on Power Apps blog.

Dynamics 365 Sales App | Sales accelerator brings digital scalability within reach

We’re introducing sales accelerator, a digital selling capability of Microsoft Dynamics 365 Sales. 

Sales accelerator helps sellers work faster and smarter with:

  • A prioritized work list of the next best lead or opportunity

  • The ability to engage through the customer’s preferred channel with integrated email, a phone dialer, and a Microsoft Teams channel (coming soon)

  • For more technical details, check out our blog to “Enable digital selling with the sales accelerator in Dynamics 365 Sales.”

Dynamics 365 Customer Engagement (on-premises) | V9.1 On-Premises update brings the latest version of Unified Interface

  • The Microsoft Dynamics 365 for Customer Engagement Version 9.1 On-Premises is coming in Q2 2021.
  • In March 2021, we will publish the dates on when they will become available so you can plan your upgrades. 
  • This release contains major updates to the Unified Interface with improved performance and a host of new and updated usability features.  
  • Upgrades from Microsoft Dynamics 365 version 8.2 or Microsoft Dynamics 365 version 9.0 to version 9.1 will be supported for customers to take advantage of the latest version of Unified Interface. 

What’s New for Unified Interface on-premises?

  • Unified Interface uses responsive web design principles to provide an optimal viewing and interaction experience for any screen size, device, or orientation.
  • Whether you are using a browser, tablet, or phone, you will be able to consume similar experiences.
  • With this release, features and improvements that were previously only available online will now also be available on-premises, such as:
    • Enterprise Sales usability enhancements
    • Improved data management with record sharing, bulk edit and merge capabilities
    • Timeline usability enhancements
    • Agent productivity enhancements for knowledge capabilities
    • Case routing rules
    • Advanced Find
    • Hierarchy view
    • Improved navigation experiences
    • Main form dialog editing experience for queue items
    • Hybrid experiences like Sharing
    • Personal views 

Recommendation for D365 On-Premises Customers

  • Now is a great time to start planning and preparing to transition on-premises deployments from the legacy web client to Unified Interface for model-driven apps, and to take advantage of these major improvements!

Dynamics 365 Customer Engagement (on-premises) | On-premises data gateway February 2021 update is now available

Here are some of the highlights in the February 2021 update release:

  • Azure Cosmos DB connector renamed from DocumentDB to Cosmos DB on Gateway

  • Compatible with February PBI desktop

  • .Net Framework Update

  • Deprecations:

    • Kerberos single sign-on (SSO) to SAP BW using gx64krb5

    • SAML single sign-on (SSO) to SAP HANA using OpenSSL

The following issue has also been resolved:

  • In a Flow, when using an action where fields change dynamically, if any change was made in the source of the connector, it was necessary to restart the gateway to clear the cache to see the changes.

Learn more on the update here on Microsoft Power BI Blog.  

D365 Online | Deprecations | Most recent deprecations with deadlines in 2021

IE 11 support for Dynamics 365 and Microsoft Power Platform is deprecated

  • In line with the already public Microsoft 365 and Azure DevOps announcements, you have probably received a message in the Message Center announcing that we will no longer support Internet Explorer 11 (IE 11) after August 17, 2021.
  • This will impact customers who use Dynamics 365 and Microsoft Power Platform products that are designed to be used through an Internet Explorer 11 interface.  
  • After August 2021, Internet Explorer 11 won’t be supported for such Dynamics 365 and Microsoft Power Platform products. We recommend that customers transition to Microsoft Edge. 

 

Low-density headers in model-driven apps won’t be supported with the 2021 release wave 2

  • With the upcoming 2021 release wave 2 (public preview in August 2021 and GA in October 2021), the low-density header option and runtime experience won’t be supported in model-driven app forms.
  • Refer to the Microsoft docs to understand why this is needed and the action required.

 

Form footers in model-driven app won’t be supported with the 2021 release wave 2

  • With the upcoming 2021 release wave 2 (public preview in August 2021 and GA in October 2021), form footers won’t be supported in a model-driven app form.
  • Refer to the Microsoft docs to understand why this is needed and the action required.

 

App Source: Deprecation of Attachment Management Add-on for Dynamics 365 using Azure Blob storage

  • The Attachment Management solution is a popular add-on for Dynamics 365 Sales/Customer Service to manage notes and email attachments using Azure Blob storage.
  • It enables business users to optimize their use of Dynamics 365 Sales/Customer Service storage and retrieve files on demand through Dynamics 365.
  • The solution is built on Dynamics 365 and works seamlessly on Dynamics CRM 2016 and above.

 

By June 2021, this solution will be deprecated and removed from AppSource.

  • Existing users of this solution are recommended to move to the default Dataverse file storage by providing consent on the Blob storage settings page.
  • Once consent is accepted by the customer, all solution-related create/update plug-ins will be disabled, and all new notes and activity MIME attachments will move to the default file storage.
  • The product team will then migrate the old attachments to the default file storage in the back end.
  • Additional details are available here.

 

Organization data download filters for mobile offline are deprecated

  • Effective February 2021, the Organization data download filter option, which filters data when you set up mobile offline, is deprecated.
  • We recommend that you start preparing your organization by moving relevant data filters from the Organization data download filter to the offline profile option, which lets you determine what data will be available when users work in offline mode.

Once the old filter criteria have been moved to the offline profile, you can clear or delete the filters set in the Organization data download filter.

 

TLS RSA cipher suites are deprecated

  • Beginning March 1, 2021, customers can only use our standard cipher suites. This change impacts your clients and servers that communicate with our servers, for example, syncing emails from your Microsoft Exchange server, running outbound plug-ins, using native (local) clients to access our servers. 

 

Dynamics 365 Home is deprecated

Dynamics 365 Home users will see a notification about the new location and a recommendation to change browser bookmarks starting October 1, 2020.

 

Deprecation of Office365 authentication type and OrganizationServiceProxy class for connecting to Dataverse

  • Effective April 2021, the authentication protocol will be retired for all new environments within a tenant.
  • Effective April 2022, the authentication protocol will be retired for all new and existing environments within a tenant. 
  • Effective February 4, 2020, the WS-Trust authentication type that is used by custom clients to connect to Dataverse is deprecated.
  • This change affects applications that utilize Microsoft.Xrm.Sdk.Client.OrganizationServiceProxy and Microsoft.Xrm.Tooling.Connector.CrmServiceClient classes for the authentication type of “Office365”.
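As a hedged illustration only (the organization URL, username, app registration ID, and redirect URI below are placeholders, not values from this announcement), affected clients typically move off the deprecated “Office365” authentication type by switching the CrmServiceClient connection string to OAuth:

```
# Deprecated WS-Trust style (scheduled for retirement):
AuthType=Office365;Url=https://contoso.crm.dynamics.com;Username=user@contoso.com;Password=<password>

# OAuth replacement:
AuthType=OAuth;Url=https://contoso.crm.dynamics.com;Username=user@contoso.com;AppId=<app-registration-id>;RedirectUri=http://localhost;LoginPrompt=Auto
```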

 

Regional Discovery Service is deprecated

  • Until March 1, 2021, Microsoft will continue to provide support, security, and other critical updates for the regional Discovery Service, but won’t release any additional functionality beyond what has already been announced.
  • After March 1, 2021, the regional Discovery Service won’t be available. 

 

Some client APIs are deprecated

  • Deprecated Client API: ClientGlobalContext.js.aspx
  • Replacement Client API: None
  • Comments:
    • The ClientGlobalContext.js.aspx page is deprecated and scheduled to be unavailable after October 1, 2021.
    • Alternative methods to access global context information will be available before April 1, 2021. 

Reminder | For the full list of deprecations, check out the


Updates & Releases

 

Manage Updates | General availability deployment

 

 

  • Check out Dynamics 365 and Power Platform Release Plans to learn more about new features to be released in the release waves.
  • After a release wave is generally available, all environments will be automatically turned on to receive mandatory updates, which will enable the early access features and the generally available features of a release.
Tip: Stay informed with Message Center notifications for Dynamics and PowerApps HERE

Microsoft Dynamics 365 Online Release

 

Note:

  • The Service Updates for Dynamics 365 online version 9.1 are now available in some regions and coming soon to others.
  • To determine which version your organization has applied, check your Microsoft Dynamics 365 Online version number.
  • Click the gear icon in the upper-right corner, then click About.
  • Platform Services
    • When creating a canvas app within a solution and viewing the dependencies in the solution, the canvas app was not listed as a dependency.
    • The “Call to” and “Call from” fields were empty in the Activity entity grid when used in offline mode. *
  • Accessibility
    • When creating a new record within a custom entity, the lookup menu did not contain a name label in the Unified Interface.
    • On iOS, in the Field Service mobile app, the accessibility screen reader did not read the name or purpose of navigation buttons. *

 

Service Update 21023 (9.2.21023 or higher) resolved issues including those in the list below and much more:
  • Platform Services
    • When importing a contact record into a marketing campaign, the import remained in the “transform” stage. *
  • Unified Interface
    • New records created in the contact sub grid were not displayed in the sub grid. *
    • When creating a phone call activity, the activity description was not present in the timeline. *
    • The Notes tab in a Booking entity and Timeline tab of a work order continuously loaded. *

Microsoft Dynamics 365 On-premises Updates

 

For the latest Service Updates for Dynamics 365 on-premises versions 8.2 and 9.0, visit Service Updates

Cumulative updates available for Microsoft Dynamics CRM 9.0

  • Repaired Functionality
    • Issue with the Audit Summary view filter not showing correctly.
    • White space display issue on editable sub-grid.
    • Unexpected error displaying cases on TimeLineWall.
    • Upgrade of BulkOperation Activities from v8

Portal Capabilities for Microsoft Dynamics 365 Releases

 

For the latest Portal capabilities, visit Latest Portal capabilities  - Read the Potential Breaking Change section for awareness.

Version Check and Upgrade Information

  • To determine the version of a particular portal deployment, please refer to KB #3166126.
  • For instructions on how to upgrade the solutions within Dynamics 365 when a newer version is available, please refer to KB #3192042. 

Latest Portal Capabilities Update-

This release includes only an updated portal host and no solution package updates.

The portal host will automatically be updated to version 9.2.10.19 by Microsoft.

To learn the release schedule in your geographical region, please check the Office 365 message center.

The latest Portal host Version 9.2.10.x resolves the following issues and more:

  • Related Record Filtering defined on lookups is not working correctly when a 1:n relationship is used to filter records.
  • Unable to search datetime field in dd/MM/yyyy format when using search functionality in entity list.
  • Month chart does not honor the sorted order defined in the chart for link entities


Additional Information

 

 

PowerApps | January 2021 Updates recap for Microsoft PowerApps

Power Apps feature recap for the month of January 2021:

Go here to register for the next Power Apps Global Bootcamp session, where you can see demonstrations of features being worked on that enable even better integration of Power BI and Power Automate.

Power Apps | New Search Experience and quick actions in Power Apps

                                                                                                                                                                                                            

  • The new search experience enhances productivity, makes information discovery easier, and is an essential part of how customers navigate through model-driven Power Apps. 
  • With relevance search and 2021 release wave 1 preview enabled, you can try out all of these capabilities:
    • Search bar in the header
    • Suggested results, as you type
    • Results page that is easy to understand
    • Quick actions with search 
To find out more, check out this page on the Power Apps blog.

Power Apps | Use ‘Edit in Excel’ in Dataverse for Teams

 

To learn more about this functionality, check out this page on Power Apps blog.

Power Automate | Automate document processing with Power Automate

 

Power Platform enables you to build a rich and robust document automation solution using:

  • Power Automate to orchestrate the overall process.
  • AI Builder to bring the intelligence required to efficiently extract information from documents.
  • Power Apps to allow users to manually review and approve documents.
  • Dataverse to manage the document queue and store all the data, files and configuration required.
We released the ‘document automation’ solution, which provides this standard end-to-end capability and includes the following components:
  • Manage documents received in emails orchestrated by Power Automate,
  • Use AI Builder form processing solution to extract data from those documents,
  • Process the data extracted and allow users to review and approve data, through a central manual validation Power App.

Power Automate | 12 New Connectors Released in January 2021

 


Power Automate | Power Automate Desktop February 2021 Update

 

The February 2021 update of Power Automate Desktop has been released!

New features and updates have been added, it includes:

  • The ‘Run from here’ functionality is now available in the flow designer.

  • Various actions have been improved to be configured more intuitively.

  • Existing variables can now be selected and used as produced variables.

Learn more about the February 2021 Update here on Power Automate blog. 

Training Corner

Power Apps | Optimizing data migration – integration with Power Platform


This blog covers a scenario where we need to move data (testing or building a data migration/integration process) using ADF (Azure Data Factory), and how to optimize the data transfer. 

  • For instance, you have to move millions of rows quickly; in addition, you have a daily integration that needs to move a high number of rows in a very short time frame.
  • You are able to push some data, but not fast enough. 

This blog helps explain the impact of different configurations such as batch size and parallelism level. 

For full details, see this page on the Dynamics 365 Community blog.

Power Apps | Enhanced component properties


We’ve added two great new experimental features to formula-based components:

  • Property parameters.  You can now pass parameters for a property evaluation, similar to how you would pass parameters to a parameterized query or function in other languages.   You can now, for example, define an output property MathUtils.RandBetween that does a calculation based on its parameters and is called like a function.

With property parameters, you can now create simple user defined functions with formulas! 

There are some limitations, such as output properties must be pure without side effects, but it shows the direction we are headed.

There will be more to come in the months ahead.

Read more details on this page to test the experimental features on Power Apps blog.
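As a rough analogy only (Power Fx syntax differs, and this Python sketch is not the Power Apps implementation), a parameterized output property behaves like a pure function attached to a component: it computes its result solely from its parameters and has no side effects.

```python
import random

# A component modeled as a class whose output properties are pure
# functions: the same inputs always produce the same result, and
# nothing outside the component is mutated.
class MathUtils:
    @staticmethod
    def rand_between(low: int, high: int, seed: int) -> int:
        # Taking the seed as an explicit parameter keeps the
        # property pure and repeatable.
        return random.Random(seed).randint(low, high)

# Called like a function, as the blog describes for MathUtils.RandBetween:
value = MathUtils.rand_between(1, 10, seed=42)
```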

Dynamics 365 On-premises | Set of Best practices for a Dynamics 365 on-premises deployment


Here is a list of good documentation and best guidance for a Dynamics 365 On-Premises deployment:

Dynamics 365 Customer Engagement in the Field

Power Platform – Dynamics 365: Technical Update Briefing Webcast: March 25

March 13, 2021   Microsoft Dynamics CRM

Please join us on March 25, 2021 at 11:00am US Eastern Time for a Technical Update Briefing regarding Power Platform and Dynamics 365.  This update will include top of mind updates for Power Platform and Dynamics 365 topics and will be based on our monthly updates we have been providing in this blog for some time.  The webcast will be delivered via a Teams Live Event, with a link to join below and a link to add to your calendar.  Some of the topics we will be covering are:

  • Microsoft Events (Ignite, BizApps Launch event…)
  • Power Apps and Dynamics 365 2021 release wave 1
  • Introducing Power Apps mobile app’s new look
  • Understanding Updates
  • Deprecations
  • And many others…

Questions will be taken via chat during the webcast and we will answer as many as possible.  So please join us March 25 for the latest updates on Power Platform and Dynamics 365!

Meeting link (accessible March 25 prior to meeting)

Add to Calendar link (iCalendar File)

Microsoft Business Applications Customer Engineer Team

Dynamics 365 Customer Engagement in the Field

IBM launches AI platform to discover new materials

March 6, 2021   Big Data


IBM today announced the launch of the Molecule Generation Experience (MolGX), a cloud-based, AI-driven molecular design platform that automatically invents new molecular structures. MolGX, a part of IBM’s overarching strategy that aims to accelerate the discovery of new materials by 10 to 100 times, uncovers materials from the property targets of a given product.

The chemical sciences have made strides in the discovery of novel and useful materials over the past decades. For example, in the area of polymers, the recent development of thermoplastics has had an influence on applications ranging from new paints to clothing fibers. But while the discovery of new materials is the driving force in the expansion and improvement of industrial products, the vastness of chemical space likely exceeds the ability of human experts to explore even a fraction of it.

By observing and selecting a dataset, MolGX leverages generative models to produce molecules from chemical properties like “solubility in water” and “heatability.” The platform trains an AI model to predict chemical characteristics within given parameters and synthesizes molecular structures based on the model built.

“The development of new materials follows a number of different pathways, depending on both the nature of the problem being pursued and the means of investigation. Breakthroughs in the discovery of new materials span from pure chance, to trial-and-error approaches, to design by analogy to existing systems,” Seiji Takeda, technical lead of material discovery at IBM, wrote in a blog post. “While these methodologies have taken us far, the challenges and requirements for new materials are more complex — so too are the demands and issues for which new materials are needed. As we face global problems such as pandemics and climate change, the necessity and urgency to design and develop new medicines and materials at a faster pace and on a molecular scale through to the macroscopic level of a final product is becoming increasingly important.”

IBM has released a free trial version of MolGX trained using a built-in dataset, which the company applied internally to the development of a new photoacid generator — a key material in electronics manufacturing. A professional version of MolGX with additional functionality, including data upload, results exportation, customized modeling, and more, is available with a license. According to IBM, this paid release carried out the inverse design of sugar and dye molecules over 10 times faster than human chemists at Nagase & Co Ltd, a chemical manufacturing company.

Beyond IBM, startups like Kebotix are developing AI tools that automate lab experiments to uncover materials faster than with manual techniques. Meanwhile, Facebook and Carnegie Mellon have partnered on a project to discover better ways to store renewable energy, in part by tapping AI to accelerate the search for electrocatalysts, or catalysts that participate in electrochemical reactions.


Big Data – VentureBeat

Teradata Joins Open Manufacturing Platform

March 2, 2021   BI News and Info

Teradata will work with leading manufacturers in the Open Manufacturing Platform community to develop solutions for industrial IoT and Industry 4.0

Group aims to enable faster and more cost-effective innovation and production in manufacturing and automotive through cloud-based data analytics

Teradata (NYSE: TDC), a leading multi-cloud data warehouse platform provider, joined the Open Manufacturing Platform (OMP) – a global alliance initiated by the BMW Group and Microsoft to drive industrial IoT developments and build future Industry 4.0 solutions in the manufacturing and automotive sector. As a member of the OMP, Teradata will help manufacturers accelerate innovation and achieve production efficiencies across the entire value chain by unlocking the potential of their data.

Common Challenges for Industry 4.0

While working to build their “smart factory,” in which all plants, products and processes are digitalized and interconnected, manufacturers worldwide share similar advantages and challenges: They benefit from a rapid adoption of new technologies that allow for greater operational efficiencies, factory output, customer loyalty and net profits, but too many manufacturers still struggle with the real-world implementation of IoT concepts. One such challenge is ensuring that data from legacy, as well as modern systems, is interoperable and usable across a multi-cloud environment so that holistic decisions and predictions can be made. Proprietary systems, which so often become data silos, make reliable visibility throughout the supply chain daunting, and thus hinder future business models. In addition, thousands of analytical models are needed to evaluate the petabytes of data produced by machines and robots every hour in order to generate actionable insights.

Building the Future of Manufacturing with Data Analytics

To solve these pressing challenges, Teradata will bring its extensive experience, expertise, and solutions in cloud-based data analytics to the OMP community, with a specific focus on the “Manufacturing Reference Architecture” and “Semantic Data Structuring” working groups. In open collaboration and knowledge sharing with leading manufacturers, Teradata will work to develop a blueprint for an industrial IoT architecture, and to simplify the deployment and management of the entire cloud and edge manufacturing environment. Teradata will also help develop a semantic data structuring layer so that manufacturers can easily share, reuse, and evaluate heterogeneous data.

Both are critical projects in terms of the objective of the working groups: to define and to use a common open data model, to influence existing and future standards, and to build open solutions for industrial IoT. By being completely independent of cloud providers or other vendors, Teradata’s technology substantially aligns with the core principle of the OMP to bring forward platform-agnostic solutions that enable a smarter future in manufacturing.

“Teradata has been helping the world’s leading manufacturing and automotive companies boost their production through data for more than 40 years,” says Peter Stadler, Vice President, Central Europe at Teradata. “As a member of OMP, we are pleased to accompany them on their way to Industry 4.0 and to contribute to new cutting-edge solutions with the help of data analytics and thus also to smart manufacturing of the future.”  

Founded in 2019 under the umbrella of the Joint Development Foundation, a part of the Linux Foundation, the OMP helps manufacturing companies accelerate innovation at scale through open cross-industry collaboration, knowledge and data sharing as well as access to new technologies. Current OMP steering committee members are Anheuser-Busch InBev, BMW Group, Bosch Group, Microsoft, and ZF Friedrichshafen AG.

Teradata also contributes to other industry alliances, such as the Open Subsurface Data Universe (OSDU) Forum, focused on developing a standard data platform that brings together exploration, development, and wells data in the oil and gas industry.

More information about the Open Manufacturing Platform: https://open-manufacturing.org/

Teradata United States

TripleBlind raises $8.2 million for its encrypted data science platform

March 1, 2021   Big Data

TripleBlind, a Kansas City, Missouri-based startup developing a platform that enables companies to train models on encrypted data, today announced that it raised $8.2 million. The company says the proceeds will be put toward R&D as it looks to expand the international reach of its products.

In industries like financial services and health care, privacy regulations like the U.S. Health Insurance Portability and Accountability Act (HIPAA) prevent companies from sharing data. As a result, 73% of enterprise data goes unanalyzed, according to a recent analysis.

Earlier in his career, CEO Riddhiman Das was the product architect at EyeVerify, where he helped commercialize a software-only biometric method for verifying the identity of mobile users. EyeVerify was bought by payments giant Ant Financial, but it ran into privacy issues and wasn’t able to collect eye image data with which to improve its algorithms. Shortly after, cofounder Greg Storm and Das left Ant to devote themselves to creating a solution — TripleBlind — that focused on market-driven and regulatory concerns with regard to data storage and auditability.

TripleBlind claims to have developed “next-generation” cryptology that allows companies to provide and use sensitive data and algorithms in an encrypted space, without compromising privacy. The startup aims to launch a marketplace that will let research institutes, businesses, and engineers collaborate around both sensitive data and algorithms.

TripleBlind’s product encrypts data during upload and affords users the option of encrypting their algorithms where intellectual property theft might be of genuine concern. The company’s digital rights management and auditability tools for transactions are designed to complement this by letting suppliers control who, when, how often, and for what purpose their models are used.

For example, in health care settings, TripleBlind — which has an ongoing partnership with the Mayo Clinic, Accenture, and about a dozen other partners and customers — enables health care providers to share only encrypted data from partners, providers, and patients. That allows data scientists to perform operations they would not be able to do using raw electronic medical records without running afoul of HIPAA.

“The core thesis at TripleBlind is that privacy enforced data and algorithm interactions will unlock the tremendous value, currently trapped in private data stores and proprietary algorithms,” Das said. “We will move the world from ‘don’t be evil’ to ‘can’t be evil,’ by enabling everyone to freely collaborate around their most sensitive data and algorithms without compromising their privacy. Together we will create a new paradigm of compounded value.”

This latest investment in TripleBlind brings the company’s total raised to nearly $10 million, following a round led by Accenture in November 2020.

Enthusiasm for data encryption has given rise to a cottage industry of startups estimated to be worth a combined $268.3 million by 2027. Duality Technologies, which recently attracted funding from one of Intel’s venture capital arms, pitches its homomorphic encryption platform as a privacy-preserving solution for “numerous” enterprises, particularly those in regulated industries. In March 2019, Paris-based Cosmian raised €1.4 million (about $1.5 million) for a data encryption product that combines functional and homomorphic encryption. There’s also Enveil, which is developing “enterprise-scale” data encryption solutions, and security-conscious collaboration platform Cape Privacy.


Big Data – VentureBeat

Optimizing data migration/integration with Power Platform

February 28, 2021   Microsoft Dynamics CRM

You are getting close to your “Go Live Date”. You have gone through multiple iterations of testing and are now in the process of testing and building the data migration/integration process. You have to move millions of rows quickly; in addition, you have a daily integration that needs to move a high number of rows in a very short time frame. You are able to push some data, but not fast enough.

If you are still following, you might have encountered this issue in the past, or you may encounter a similar one in the future. In this blog I’ll cover a scenario where we need to move data using ADF (Azure Data Factory) and how to optimize the data transfer. I’ll also explain the impact of different configurations such as batch size and parallelism level. I hope you will find this blog helpful.

Challenge

We need to move data from one instance to another; the data we are moving is in a custom table. This is a very simple table with few fields, and no plugins are triggered. The amount of data we need to move is about 36K rows, and we need to move it as fast as we can.

Solution

If we are moving row by row and each row takes 1 second to process, it may take up to 10 hours (36K seconds) to process all the rows. Luckily, Dynamics 365 supports batching and parallelism, so we can speed up the process. When planning to utilize parallelism, we should also look at the Power Platform throttling and API call limitations.

See links:

Service Protection API Limits

Requests limits and allocations
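To make the batching-plus-parallelism idea concrete, here is a minimal sketch (not the author's ADF pipeline; `send_batch` is a hypothetical stand-in for whatever actually writes a batch to Dataverse) showing how rows can be split into batches and processed concurrently:

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(rows, batch_size):
    """Split rows into batches of at most batch_size."""
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

def send_batch(batch):
    # Hypothetical stand-in: a real pipeline would issue one bulk
    # create/update request per batch against the Dataverse API.
    return len(batch)

def migrate(rows, batch_size=100, parallelism=16):
    batches = chunk(rows, batch_size)
    # Cap in-flight requests at `parallelism`, mirroring the
    # "degree of copy parallelism" setting in ADF.
    with ThreadPoolExecutor(max_workers=parallelism) as pool:
        return sum(pool.map(send_batch, batches))

# 35,985 rows in batches of 100 -> 360 batch requests instead of
# 35,985 row-by-row calls.
```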

To figure out the best configuration to move data, I experimented with different scenarios. In the screenshots below, you can see the Azure Data Factory configuration with the Dynamics 365 connector.

In the Sink tab, you can configure the batch size and max concurrent connections:


  

In the Setting tab, you can configure the degree of copy parallelism:


In case you are not familiar with Azure Data Factory, here are some useful links:

Copy data from and to Dynamics 365 (Common Data Service) or Dynamics CRM by using Azure Data Factory

Azure Data Factory documentation

I ran multiple scenarios moving the data; you can see the results in the following table:

Batch size   Concurrent   Parallel   Rows    Seconds   Create rows per second
10           52           32         35985   98        367.1938776
50           52           32         35985   84        428.3928571
100          52           32         35985   86        418.4302326
100          52           20         35985   99        363.4848485
100          52           16         35985   113       318.4513274
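The “Create rows per second” figure in these results is simply the row count divided by the elapsed seconds; as a quick check against the first run above:

```python
rows, seconds = 35985, 98
throughput = rows / seconds
print(round(throughput, 2))  # 367.19, matching the first row of the table
```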

Consider the scenario where I deal with a larger data set: can I use the same configuration, or do I need to modify it? Please see the following scenarios:

Batch size   Concurrent   Parallel   Rows     Seconds   Create rows per second
10           52           32         150000   1147      130.7759372
50           30           30         150000   1133      132.39188
100          10           16         150000   834       179.8561151
200          24           8          150000   717       209.2050209
200          24           12         150000   795       188.6792453

Conclusions

  1. Increasing the batch size doesn’t always help. Batch size can also have a different impact based on the latency and the locations the processes are running from and to.
  2. You can see that there is a difference between a very high amount of data and a small amount of data in how you can spread the process to run in parallel.
  3. Increasing parallelism can help a lot, but if we have too many processes in parallel and a high amount of data, that will actually result in a slower rate of operations (see the Service Protection API Limits).
  4. If you are encountering this problem, use the tables above as a guideline: measure how many records you are creating, and look at the throttling and limitations documentation. I highly recommend making a few tests to optimize your process.
  5. You can apply the same optimization process when you use other ETL tools or build your own console applications. For more documentation and configuration, see this link: How to maximize throughput
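When parallelism pushes past the service protection limits, Dataverse throttles requests (HTTP 429 with a Retry-After header). A minimal retry sketch, assuming a `call_api` callable of your own that returns a status code and the server's suggested wait:

```python
import time

def with_retry(call_api, max_retries=5):
    """Retry a throttled call, honoring the server's Retry-After hint."""
    for _ in range(max_retries):
        status, retry_after = call_api()
        if status != 429:
            return status
        # Back off for the duration the service requested, then retry.
        time.sleep(retry_after)
    raise RuntimeError("still throttled after retries")

# Stubbed example: throttled twice, then succeeds.
responses = iter([(429, 0.0), (429, 0.0), (200, 0.0)])
print(with_retry(lambda: next(responses)))  # 200
```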

Disclaimers

Please note that I oversimplify the process here; things to consider when moving data:

  1. Are we triggering plugins? Can we turn them off?
  2. What type of entities are we moving, and how many attributes? For example, creating a contact is more expensive than creating a custom entity, and creating an entity with 10 columns is more expensive than creating an entity with 2 columns (column type is also a factor).
  3. How much data already exists in the system can also have an impact.
  4. What type of operation are we doing: create/update/insert/delete?

Dynamics 365 Customer Engagement in the Field