
Tag Archives: Azure

Build and Release Pipelines for Azure Resources (Logic Apps and Azure Functions)

January 16, 2021   Microsoft Dynamics CRM

Today’s blogpost is lengthy but should serve as a fantastic resource going forward. We will explore Build and Release Pipelines for Azure Resources as follows: Introduction; Creating a build pipeline; Creating a release pipeline; Task Groups; Library; Logic App parameters. Enjoy (and bookmark!). Azure Pipelines is a fully featured continuous integration (CI) and continuous delivery (CD) service.

Source: PowerObjects – Bringing Focus to Dynamics CRM

Azure Availability Blog Series: Part 1 of 3

December 18, 2020   Microsoft Dynamics CRM

The move from on-premises to Azure brings its fair share of paradigm shifts to traditional architecture design, even if adoption of Azure services is not in the plan. Virtual Machine (VM) availability, or uptime, is one area that can cause a lot of issues if not planned for ahead of time, leading to rework and possible downtime. In this three-part blog series, we will look at why availability…

Source: PowerObjects – Bringing Focus to Dynamics CRM

Better Microsoft CRM with Azure Synapse Analytics & Power BI Premium Gen 2

December 7, 2020   CRM News and Info

Let’s be honest. The world we live in is driven by data. Presentation and efficiency matter. Using the right CRM with the best visuals and security will not only increase your revenue, but guarantee the satisfaction of your valued customers.

JourneyTEAM, a Microsoft Gold Partner, will help you get your organization ahead of this data-driven game. Microsoft announced updates with safer data storage, and smarter, faster analysis platforms. Introducing Azure Synapse Analytics and the Next Generation Power BI Premium! Below you’ll discover why you need not just one, but both in supporting your CRM.


Teamwork Makes the Dream Work

Azure analytics and Power BI are an unparalleled combination. This duo is the optimum performance accelerator: you can reap the benefits of increased capabilities and strong cost management. Using Azure and Power BI together allows you to extract maximum data and technology value for your organization. Just as your customers need to trust you, you need to use your data productively with Microsoft technology you trust. Read more to learn about each product, then see how they make the best team.

The Basics of Azure

Microsoft’s most popular cloud computing service is known as Azure. Azure gives you access to a limitless directory of:

  • Databases
  • Backups
  • Virtual machines
  • File storage
  • Services for web and mobile apps 


The Update: Azure Synapse 

Azure Synapse Analytics is a modern, cloud-based data warehouse. Now, you can build, deploy, and run edge and hybrid computing apps. This development environment allows you to run these apps consistently across your IT ecosystem, and it is flexible regardless of the workload. Synapse Analytics offers benefits in performance, protection, and price.

  • Pristine Performance – Synapse is limitless. The service seamlessly unites enterprise data warehousing and big data analytics, so you can easily learn, prep, manage, and serve data. Azure Synapse has the capacity to let you query data on your own terms, using either provisioned or serverless resources. 
  • Protection – Azure’s analytics are services you can trust. Expect nothing less than sound protection when it comes to data governance, compliance, and security. You can find comfort in knowing your most valuable business data is protected by some of the most advanced security and privacy features on the market. 
  • Cost Management – If transformative insights are what you’re looking for, Azure Synapse is what you need. With analytics that produce a significant 271 percent ROI, it is the undisputed price-performance leader: up to 94 percent cheaper and up to 14 times faster.

The Basics of Power BI

Microsoft’s Power BI is an easy-to-use, scalable platform. This business analytics service helps you gain deeper insight from your data and makes analyzing it an experience to enjoy. Get impressive visual data charts, dashboards, and reports. Personalize your reports with your brand and KPIs. Connect to your data and create unforgettable presentations. Everyone in your organization will love Power BI’s stunning visualizations.


The Update: Power BI Premium Gen 2

With the new Premium capacities, you have access to all of the features of Power BI. Power BI Premium Gen 2 and its improved architecture provide the following benefits:

  • You get free distribution and licensing for on-premises usage, bringing Premium capabilities to non-Premium users.
  • Analytics operations run up to 16X faster with enhanced performance at any capacity size, any time, and top-speed operations will not slow down.
  • Fewer memory restrictions mean you don’t have to track refresh schedule spacing; scheduled refreshes and report interaction are completely separated.
  • Autoscale automatically adds one v-core at a time for 24-hour periods and removes it when idle time is detected.
  • Utilization analysis, budget planning, and metrics have been reimagined; with built-in reporting, you have clear visibility into chargebacks and into when you need to upgrade.

How They Work Together

Azure Synapse and Premium Gen 2 work in perfect harmony. The collaborative process works roughly as follows: Synapse identifies useful materialized views from the collected query patterns generated by Power BI; the views are then created and queries are directed to them automatically. You are able to deliver solutions for reporting, advanced analytics, and real-time analytics with little effort on your part. Let Azure Synapse and Premium Gen 2 help your business run even more smoothly.

Get Azure Synapse Analytics and Power BI Premium Gen 2 with JourneyTEAM

JourneyTEAM will show you how to get the most out of these services along with your CRM. We’re a Microsoft Gold Partner with knowledge and experience with Microsoft products. Allow us to help you make this transition. Contact us today!


JourneyTEAM was recently awarded Microsoft US Partner of the Year for Dynamics 365 Customer Engagement (Media & Communications) and the Microsoft Eagle Crystal trophy as a top 5 partner for Dynamics 365 Business Central software implementations. Our team has a proven track record with successful Microsoft technology implementations and wants you to get the most out of your organization’s intranet system. Consolidating your communication, collaboration, and productivity platform with Microsoft will save time and money for your organization. Contact JourneyTEAM today!


Article by: Dave Bollard – Head of Marketing | 801-436-6636

JourneyTEAM is an award-winning consulting firm with proven technology and measurable results. They take Microsoft products (Dynamics 365, SharePoint intranet, Office 365, Azure, CRM, GP, NAV, SL, AX) and modify them to work for you. The team has expert-level, Microsoft Gold certified consultants that dive deep into the dynamics of your organization and solve complex issues. They have solutions for sales, marketing, productivity, collaboration, analytics, accounting, security and more. www.journeyteam.com

Source: CRM Software Blog | Dynamics 365

Tagging In Azure DevOps

November 8, 2020   Microsoft Dynamics CRM

What is Tagging? Tagging has long been a term used to describe a kind of graffiti to mark public spaces. In Azure DevOps (or ADO), tagging is similar because it can serve as a colorful label for work items to visually distinguish one epic from another, isolate a critical user story from hundreds of others, and so on. The big difference here is that tags in ADO should be encouraged (though…

Source: PowerObjects – Bringing Focus to Dynamics CRM

Top 5 reasons why Attach2Dynamics is preferred document management app within Dynamics 365 CRM for SharePoint, Dropbox and Azure Blob!

October 17, 2020   Microsoft Dynamics CRM


One of the important roles of Dynamics 365 CRM is to provide structured document management functionality. Because documents in Dynamics 365 CRM are stored as attachments or Note records, managing a large number of documents becomes quite complex. And the larger the database grows, the more it costs to buy additional storage. Hence, a solution that seamlessly integrates Dynamics 365 CRM with the cloud storage of your choice for document management becomes a necessity.

Attach2Dynamics, as the name implies, is a Microsoft Preferred Dynamics 365 CRM / Power Apps solution that provides efficient management of Dynamics 365 CRM attachments and documents by integrating CRM with multiple cloud storages, viz. SharePoint, Dropbox, and Azure Blob Storage. It has a user-friendly interface that lets you perform common file management actions like upload, download, browse, and delete documents within Dynamics 365 CRM without having to navigate to the configured cloud storage. Further, you can send documents as emails and generate sharable links to files and folders for emailing. And all this from within Dynamics 365 CRM!

To better understand how this ‘out-of-the-box’ solution Attach2Dynamics will help you increase your efficiency in Dynamics 365 CRM, let us go through some of the prominent features of this productivity app.

1. Native Integration with multiple Cloud Storages


Attach2Dynamics offers native integration between Dynamics 365 CRM and cloud storage like SharePoint, Dropbox, and Azure Blob Storage. This frees up space for document management in Dynamics 365 CRM. You can also create multiple instances of the same cloud storage with different names to store and bifurcate your data on the cloud as per your convenience.

2. File management with a user-friendly interface


With the help of the eye-pleasing, easy-to-use interface of Attach2Dynamics, you can drag and drop multiple files and folders from Dynamics 365 CRM to the configured cloud storages. You can also view all the copied/moved files and folders at a single glance. File management actions like uploading, downloading, browsing, deleting documents, configuring notes, renaming files and folders, and creating new folders are as simple as if you were working in Windows Explorer.

3. Email Link or Attachment


With the help of Attach2Dynamics, you can also send files, folders and sales literature attachments in an email. It will also allow you to create anonymous file links for sharing. You can also set default from, to, cc and bcc for an email using the ‘Email configuration’ feature of Attach2Dynamics.

4. Bulk Migration Job


This feature of Attach2Dynamics helps you copy or move History Notes, Email, and Sales Literature Attachments in bulk from Dynamics 365 CRM to SharePoint, Dropbox, or Azure Blob Storage, and you can monitor the status of the migrating attachments. For example, if you want to move the emails of the last ‘X’ days from Dynamics 365 CRM to one of the configured cloud storages in bulk, you can do so easily.

5. Security Templates


Along with cloud integration and easy file management, Attach2Dynamics also enables managers to set permissions for all the user actions in configured cloud storage on the selected files and folders. User actions like Upload, Download, Email, Copy Link, Delete, etc. can be controlled using security templates.

With all these features, Attach2Dynamics is a must-have solution for your Dynamics 365 CRM environment. Also, check out our smart app SharePoint Security Sync that synchronizes Dynamics 365 CRM and SharePoint security model with a newly added feature to add eSign in your documents using DocuSign.

Don’t let your Dynamics 365 CRM / Power Apps environment get cluttered! Download Attach2Dynamics today and increase the efficiency of your CRM environment as well as that of your team.

You can get a free 15-day trial from our website or Microsoft AppSource.

Drop us a mail at crm@inogic.com for a 1:1 demo or any of your unique document management requirements.

Stay safe and work smart!

Source: CRM Software Blog | Dynamics 365

Microsoft explains how it improved automatic image captioning in Azure Cognitive Services

October 14, 2020   Big Data


Microsoft today launched a new computer vision service it claims can generate image captions that are, in some cases, more accurate than human-written descriptions. The company calls the service, which is available as part of Azure Cognitive Services Computer Vision, a “significant research breakthrough” and an example of its commitment to accessible AI.

Automatic image captioning has a number of broad use cases, first and foremost assisting users with disabilities. According to the World Health Organization, the number of people of all ages who are visually impaired is estimated to be 285 million, of whom 39 million are blind.

Accuracy becomes all the more critical when vision-impaired users rely on captioning for daily tasks. According to a study by researchers at Indiana University, the University of Washington, and Microsoft, blind people tend to place a lot of trust in automatically generated captions, building unsupported narratives to reconcile differences between image contexts and incongruent captions. When asked to identify captions of images on Twitter that might be incorrect, even blind users who described themselves as skilled and consistent about double-checking tended to trust automatic captions, the researchers found, regardless of whether the captions made sense.

In early 2017, Microsoft updated Office 365 apps like Word and PowerPoint with automatic image captioning, drawing on Cognitive Services Computer Vision. (Cognitive Services is a cloud-based suite of APIs and SDKs available to developers building AI and machine learning capabilities into their apps and services.) More recently, the company launched Seeing AI, a mobile app designed to help low- and impaired-vision users navigate the world around them.
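For orientation, the captioning capability is surfaced through the Computer Vision REST API's describe operation. The request below is only a hedged sketch: the resource name, API version, key, and image URL are placeholders, and the exact version available to you may differ.

    POST https://<your-resource>.cognitiveservices.azure.com/vision/v3.1/describe?maxCandidates=1
    Ocp-Apim-Subscription-Key: <your-key>
    Content-Type: application/json

    { "url": "https://example.com/photo.jpg" }

The response contains a list of candidate captions, each with its text and a confidence score.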

But while Office 365 and Seeing AI could automatically caption images better than some AI baselines, Microsoft engineers pursued new techniques to improve them further.

The engineers describe their technique in a September paper published on Arxiv.org, a server for preprints. Called visual vocabulary pretraining, or VIVO for short, it leverages large amounts of photos without annotations to learn a vocabulary for image captioning. (Typically, training automatic captioning models requires corpora that contain annotations provided by human labelers.) The vocabulary comprises an embedding space where features of image regions and tags of semantically similar objects are mapped into vectors that are close to each other (e.g., “person” and “man,” “accordion” and “instrument”). Once the visual vocabulary is established, an automatic image captioning model can be fine-tuned using a data set of images and corresponding captions.

Figure: Image captioning results on nocaps. B: a baseline without VIVO pretraining. V: with VIVO pretraining. Red text represents novel objects. The bounding box color is brighter when the similarity is higher. Image Credit: Microsoft

During the model training process, one or more tags are randomly masked and the model is asked to predict the masked tags conditioned on the image region features and the other tags. Even though the dataset used for fine-tuning only covers a small subset of the most common objects in the visual vocabulary, the VIVO-pretrained model can generalize to any images that depict similar scenes (e.g., people sitting on a couch together). In fact, it’s one of the few caption-generating pretraining methods that doesn’t rely on caption annotations, enabling it to work with existing image data sets developed for image tagging and object detection tasks.

Microsoft benchmarked the VIVO-pretrained model on nocaps, a test designed to encourage the development of image captioning models that can learn visual concepts from alternative sources of data. Evaluated on tens of thousands of human-generated captions describing thousands of images, the model achieved state-of-the-art results with substantial improvement for objects it hadn’t seen before. Moreover, on a metric called consensus-based image description evaluation (CIDEr), which aims to measure the similarity of a generated caption against ground truth sentences written by humans, the model surpassed human performance by a statistically significant margin.

In addition to the latest version of the Cognitive Services Computer Vision API, Microsoft says the model is now included in Seeing AI. It will roll out to Microsoft products and services including Word and Outlook, for Windows and Mac, and PowerPoint for Windows, Mac, and web later this year, replacing an image captioning model that’s been used since 2015.

“Given the benefit of this, we’ve worked to accelerate the integration of this research breakthrough and get it into production and Azure AI,” Eric Boyd, corporate vice president of AI platform at Microsoft, told VentureBeat via phone earlier this week. “It’s one thing to have a breakthrough of something that works in a delicate setup in the lab. But to have something that [in a few months] we can have pressure-tested and operating at scale and part of Azure … showcases how we’re able to go from the research breakthrough to getting things out into production.”

Source: Big Data – VentureBeat

A Guide to Microsoft Azure and your CRM | Pizza as a Service Example

October 2, 2020   Microsoft Dynamics CRM

Organizations of all sizes are moving away from their on-premises data centers to the flexibility and scalability of the cloud. While there are thousands of cloud computing services available, one of the most popular is Microsoft Azure, a cloud computing platform designed by Microsoft. But what is Microsoft Azure, and why is it so popular? How are organizations using this platform? In this guide, we’ll provide an in-depth look at Microsoft Azure and how JourneyTEAM can help you migrate to the cloud with ease.

Cloud Computing Definition

To get a better understanding of Microsoft Azure, it’s important to know what cloud computing is. Put simply, cloud computing is the delivery of various services through the internet rather than from a hard drive or an on-premises data center. These services include anything from data storage and processing to software and analytics. Because the cloud is powered by the internet, the services and tools it hosts are accessible from anywhere with a stable internet connection. Some examples of popular cloud computing services include Google Drive and Microsoft OneDrive.


A Look Into Microsoft Azure

Microsoft Azure is a public cloud platform developed by Microsoft that gives users access to tools, services, and resources for analytics, storage, virtual computing, deployment, and development. Azure is designed to replace or supplement an organization’s on-premises data center.

Within Azure, there are three different types of cloud computing:

  • Platform as a Service (PaaS): In this model, users essentially rent everything they need to build an application from the cloud vendor. Using PaaS, users can build an application using the tools and resources available from the vendor.
  • Software as a Service (SaaS): This method allows users to access data from any device directly from an internet browser. SaaS is hosted and maintained by a software vendor.
  • Infrastructure as a Service (IaaS): This form of cloud computing provides computing, networking, and storage resources directly to users through the internet.

Pizza as a Service and Cloud Computing

The Pizza as a Service analogy was developed by Albert Barron in 2014 to help explain the differences between IaaS, PaaS, and SaaS.

  • Homemade: The process of making a homemade pizza falls entirely on you. You have to gather the ingredients, build the pizza, and bake it. On-premise computing is similar to making a pizza at home. The task of maintaining and operating the entire technology stack is your team’s responsibility.
  • Take and bake: At Papa Murphy’s, you’re able to choose what you’d like on your pizza. Based on your choices, they’ll assemble the pizza, after which you’ll pick it up and bake it. With the IaaS model, your team is similarly tasked with managing the operating system, and the cloud vendor manages the networking and virtualization parts of your system.
  • Pizza delivery: Ordering a pizza for delivery takes much of the responsibility off of you. All you have to do is provide a place where the pizza can be eaten. The PaaS model is like this: the cloud vendor provides various services you can use, and your team is responsible for managing the data and applications.
  • Eating out: When you dine out and order a pizza, a fully assembled and baked pizza is delivered to you, and the restaurant provides a place for you to eat it. The entire process is taken care of for you. SaaS is similar: everything you need to build, deploy, and scale applications is provided for you by the cloud vendor.

Putting Azure to Work in Your Business

The flexibility and agility of Azure allow businesses to put it to use in a number of ways. One of the most popular uses of Azure is running virtual machines in the cloud, but there are dozens of ways to use its services and features. Some of these include:

  • Deploy and manage virtual machines
  • Build and develop mobile applications
  • Development and deployment of online applications
  • Analytics
  • Storage
  • Networking
  • Security
  • Data backup and disaster recovery

These are just some of the ways you can make Microsoft Azure work for you. Additionally, Microsoft is continuously adding new features and tools to help you manage and deliver services.

Why Use Microsoft Azure

Microsoft Azure is one of the fastest growing cloud platforms on the market. One of the biggest reasons for its popularity is that it allows users to do so much more than build and host applications in the cloud. Additional benefits of using Microsoft Azure as your cloud platform include:

  • Tight security: When you move your data to the cloud, you want to ensure it’s protected, since it can be accessed from anywhere. Microsoft Azure features a number of security certifications and was the first cloud vendor to be fully compliant with ISO 27018.
  • Machine learning and data analytics: Included with Azure is the Azure Machine Learning Studio that features a library of algorithms that can be used to predict customer behavior and provide valuable insight.
  • Scalability: Microsoft goes a step further with scalability by allowing users to pay as they go. Whether you’re looking to launch new applications, use machine learning, or utilize data processing features, you pay only for what you use.
  • Software development kits: Whatever your platform of choice (Java, .NET, Ruby, or Python), Azure has an SDK for it.
  • Budget-friendly: When you migrate to the cloud, you no longer have to pay for software installations or upgrades or pay for extra storage or disk space. Additionally, Azure allows you to pay for only the services you need, keeping more money in your IT budget.

Migrate to Azure with the Support of a Microsoft Gold Partner

The reasons to move to Microsoft Azure are endless–the flexibility, reliability, and security of the platform being just a few. If you’re ready to make the move to Azure, let JourneyTEAM help you migrate. JourneyTEAM can help you learn these services and make them work in your organization. Whether you’re interested in machine learning, AI, analytics, or internet of things, our team is here to support you. Contact us today to get started.


About JourneyTEAM

JourneyTEAM has more than 20 years of experience in delivering IT solutions to businesses. As a Microsoft Gold Certified Partner, we’ve worked closely with Microsoft to provide organizations with Microsoft products, including SharePoint, Azure, Office 365, Dynamics 365, SL, AX, GP, NAV, and CRM, and make them work for you. Whether you’re in need of a collaboration, marketing, sales, or productivity solution, JourneyTEAM can help you find the right technology for your business. To learn more about our team, visit our website at www.journeyteam.com.

Source: CRM Software Blog | Dynamics 365

Azure Data Factory pipelines: Filling in the gaps

September 19, 2020   BI News and Info

Though the term was not really in the vernacular as it is today, I have been a full- or part-time “Data Engineer” my entire career. I have been quite comfortable with Microsoft ETL tools like SSIS for many years, dating back to the DTS days. My comfort with SSIS came from many years of trial and error via experimentation, as well as adhering to the best practices put forth and tested by many of my colleagues in the SQL Server field. It was and still is a widely used and well-documented ETL platform. With the release of Azure Data Factory several years ago, though it was not touted as an SSIS replacement, many data engineers started working with and documenting this code-free or low-code orchestration experience, and I was one of them.

As with any technology, only with knowledge and experience will you be able to take advantage of all its key benefits, and by the same token uncover its severe limitations. On a recent assignment to build a complex logical data workflow in Azure Data Factory, one that ironically had less “data” and more “flow” to engineer, I discovered not only benefits and limitations in the tool itself but also in the documentation, which provided arcane and incomplete guidance at best. Some of the incomplete knowledge I needed was intrinsic to Azure Logic Apps, which I grant I had done very little with until this project, but it played a pivotal role as an activity called from the pipeline. I wanted to share a few pieces of this project with you here in the hope of bolstering, however modestly, the available sources for quick insight into advanced challenges with ADF and, to a lesser extent, Azure Logic Apps.

Specifically, I was asked to create a pipeline-driven workflow that sends approval emails with a file attachment and waits for the recipients to either approve, reject or ignore the email. If the approvers do not respond to the emails in the time frame defined by several variables like time of day and type of file, then a reminder email must be sent. Again, the recipients can approve, reject or ignore the reminder. Finally, a third email is sent to yet another approver with the same options. Ultimately the process will either copy the approved file to a secure FTP site after both of the initial two recipients or the final recipient approves the file, or it will send an email to the business saying the file was rejected. It may sound simple enough, even in a flow diagram; however, there were several head-scratchers and frustrated, lengthy ceiling stares that I may have easily avoided with a bit of foreknowledge.

The following are the four challenges I had to overcome to call the project a success:

When sending an approval email from Azure Logic Apps, which is initiated via a Webhook activity from the ADF pipeline, how do I force a response by a specific time of day? For example, if the initial emails must be approved or rejected by 9:00 AM, and it is triggered at 8:26 AM which is itself a variable time to start, how do I force the email to return control to the pipeline in 34 minutes?

The second challenge came with the Webhook activity itself. The Logic App needed to return status values back to the calling pipeline. While there was some minimal documentation that explained that a callbackURI was needed in an HTTP Post from within the Logic App, what I found informationally lacking was how to actually pass back values.

The third challenge was processing a rejection. The logic stated that if either of the initial approvers rejected the file, then the pipeline needed to stop further processing immediately and notify the business so a secondary file may be created and run through the workflow again. If the two initial emails to approvers were set to timeout after 34 minutes with no response (following the example above) and one of the approvers rejected the file in 3 minutes, the pipeline could not dilly dally for another 31 minutes spinning cycles waiting for the other approver.

Finally, each step in the process needed to be written to a logging table in an Azure SQL Database. That was not too difficult as it was a simple matter of passing dynamic values to a parameterized stored procedure. However, the number of times this needed to happen brought a much-unexpected consequence to my attention.

Setting a timeout value for the “Send approval email” action in Logic App

As I said earlier, I did not have much experience with designing Logic Apps going into the project, so I was a bit intimidated at first. The only other simple and expedient Logic App I had used with ADF pipelines in the past was for sending Success and Failure notification during pipeline executions. You can see in the designer below that the Failure Notification Logic App consisted of two steps, an HTTP request action and a Send an email action. This is one of the only ways to send emails from ADF as it is not natively supported.


The approval aspect of the Logic App was new for this project and that did require some additional experimenting. Within minutes, though, I was sending test emails to myself and approving or rejecting them, noting the behavior of the flow. If I did not approve or reject the email, the process would just continue to wait. It was not obvious on the surface if there was a timeout value I could set.

After some fumbling around a bit, I checked the Settings of the send approval email action in the Logic App and did find a duration value that could be set, using the ISO 8601 format. I was not familiar with the format, so I had to look it up. This is an example of the ISO 8601 time format I would need to use to set the timeout value, where “P” is the period, “T” is the time, “30” is the number of minutes and “M” is the actual minute designator. Simple enough.

PT30M
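For reference, a few more values in the same notation (my own illustrative examples, aside from PT2M, which I used later while debugging):

    PT2M      2 minutes
    PT45M     45 minutes
    PT1H30M   1 hour and 30 minutes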

I now had a way to control the timeout behavior of the send approval email action so that it did not wait indefinitely if no one approved or rejected the emails. Now came the challenge of dynamically setting the number of minutes before forcing a timeout based on the time of day and file type. I needed to set this value in the pipeline and then pass this value to the Logic App that would do the work of actually sending the email.

I used an If Condition activity in the pipeline to set the ISO 8601-formatted value that gets passed to the Logic App.


The conditional equated to true or false based on the value of a pipeline parameter called RateSheetType. The expression used for this evaluation is, @equals(pipeline().parameters.RatesheetType,'Daily'). If it returned true, meaning it was a daily run, then the conditional executed a Set variable activity to set the TimeoutMinutes value to be the difference in minutes from the current time to 9:00 AM, assuming that the pipeline would have been triggered well before 9:00 AM.


The other conditions to consider are whether the difference between now and 9:00 AM is greater than zero, meaning the hour has not yet passed, and whether it is less than two hours before the hour. If both are true, the timeout value is set to the difference; otherwise it is set to a default of X minutes, in this case 30. This default is mainly there to allow enough time for a response prior to 9:00 AM and for a reminder to be sent.

I am going to go out on a limb here and say, as a career T-SQL developer and DBA, that writing expressions in ADF is, in a word, exasperating. Take, for example, the T-SQL code required to determine the timeout value from now until 9:00 AM on the same day. In this case, assume I am running the code at 8:05 AM, so I would expect the difference to be 55.

You can see it is fairly straightforward with minimal coding effort mainly due to the support for tried and true date/time functions and an effective CASE statement.

SELECT
    CASE WHEN DATEDIFF(mi, GETDATE(), DATEADD(hh, 9, DATEADD(dd, 0, DATEDIFF(dd, 0, GETDATE())))) > 0
          AND DATEDIFF(mi, GETDATE(), DATEADD(hh, 9, DATEADD(dd, 0, DATEDIFF(dd, 0, GETDATE())))) < 120
         THEN 'PT' + CAST(DATEDIFF(mi, GETDATE(), DATEADD(hh, 9, DATEADD(dd, 0, DATEDIFF(dd, 0, GETDATE())))) AS varchar(2)) + 'M'
         ELSE 'PT30M'
    END AS TimeoutValue


By contrast, the data factory expression to derive the same values is:

@If(and(lessOrEquals(div(div(mul(sub(ticks(convertFromUtc(concat(formatDateTime(utcNow(),'yyyy-MM-dd'),'T09:00:00Z'),'Eastern Standard Time')),ticks(formatDateTime(utcNow(),'yyyy-MM-ddTHH:mm:ssZ'))),100),1000000000),60),120),greaterOrEquals(div(div(mul(sub(ticks(convertFromUtc(concat(formatDateTime(utcNow(),'yyyy-MM-dd'),'T09:00:00Z'),'Eastern Standard Time')),ticks(formatDateTime(utcNow(),'yyyy-MM-ddTHH:mm:ssZ'))),100),1000000000),60),0)),concat('PT',string(div(div(mul(sub(ticks(convertFromUtc(concat(formatDateTime(utcNow(),'yyyy-MM-dd'),'T09:00:00Z'),'Eastern Standard Time')),ticks(formatDateTime(utcNow(),'yyyy-MM-ddTHH:mm:ssZ'))),100),1000000000),60)),'M'),'PT30M')

I will grant you there are additional functions used in the expression that make it a bit more complex, such as converting from UTC to Eastern Standard Time. However, the time to develop the logic initially was orders of magnitude more than the SQL equivalent. This is partly due to the nested nature of the expression as well as the unfamiliar syntax, but it was made worse by the lack of common DateDiff and DateAdd functionality. You can see that to determine the difference in minutes from now to 9:00 AM, the arcane ticks() function combined with the math functions div() and mul() had to be employed. Ticks() returns the number of ticks, 100-nanosecond intervals, that have elapsed since January 1st, 0001, which the expression then converts into minutes.

greaterOrEquals(div(div(mul(sub(ticks(convertFromUtc(concat(formatDateTime(utcNow(),'yyyy-MM-dd'),'T09:00:00Z'),'Eastern Standard Time')),ticks(formatDateTime(utcNow(),'yyyy-MM-ddTHH:mm:ssZ'))),100),1000000000),60),0))
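Reading the nested calls from the inside out, my own annotation of the unit conversion goes like this:

    sub(ticks(<9:00 AM Eastern as UTC>), ticks(utcNow()))   ->  difference in ticks (1 tick = 100 nanoseconds)
    mul(<tick difference>, 100)                              ->  nanoseconds
    div(<nanoseconds>, 1000000000)                           ->  seconds
    div(<seconds>, 60)                                       ->  minutes, which are then compared against 0 and 120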

In the end, I chose to use the expression over the T-SQL code simply because I wanted to avoid having to have an entire dataset devoted to calling the stored procedure to return a five-character string value and the additional overhead that would have entailed. I will note here that if you decide to use a stored procedure to return a value, you will need to use the Lookup activity over the Execute Stored Procedure because the latter has no way of returning the output for further processing.
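For completeness, had I gone the stored procedure route, the wiring would look roughly like the sketch below. The activity and procedure names here (Lookup_TimeoutValue, spGetTimeoutValue) are hypothetical; only the TimeoutValue column name comes from the query shown above.

    Lookup activity "Lookup_TimeoutValue"
        Source: stored procedure [dbo].[spGetTimeoutValue] on an Azure SQL dataset
        First row only: true

    Set variable activity "Set TimeoutMinutes"
        Value: @activity('Lookup_TimeoutValue').output.firstRow.TimeoutValue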

Now that the timeout value was working and the variable set, the next challenge was to pass that value to the Logic App.

Passing values to and from Logic Apps

Building a Logic App could take up an article in itself so to keep this as simple as possible, I want first to show you the HTTP Request that is called from the ADF pipeline via a Webhook activity. The Webhook supports a callback URI capability, unlike a simple Web activity. The callback functionality is needed to return values to the pipeline. In the screenshot below, you can see the Logic App request and the request body, which are the property values that the request will receive when called. Those property values can be passed to the Logic App and back to the calling application, along with additional values derived directly from the Logic App itself, such as the status of the email approval activity.


In the Data Factory pipeline screenshot below, you can see that the URL will be the POST URL from the Logic App HTTP Request and the Method is POST. It is important to also specify the headers Name and Value as Content-Type and application/json respectively. Finally, the Body will contain the JSON-formatted values to pass to the request, again noting the value of the TimeoutMinutes variable is assigned to the property value expected by the Logic App request, TimeOutValue. Using the @json() function to format the string is a bit tedious until you become acquainted with the syntax.
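As a rough sketch of that Body (only the TimeOutValue property and the TimeoutMinutes variable come from this project; any other properties you pass would follow the same pattern):

    @json(concat('{ "TimeOutValue": "', variables('TimeoutMinutes'), '" }'))

The Webhook activity appends the callBackUri property to this payload on its own, which is what the Logic App reads back later via @{triggerBody().callBackUri}.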


The timeout value for the Send approval email activity in the Logic App is not visible by default and can only be seen by going to Settings or looking for it in Code view. The value for the timeout will be set to @triggerBody()?['TimeOutValue'].
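In Code view, that setting lands on the action's limit property. A trimmed sketch of what it looks like (the action name and inputs are placeholders, not copied from the original app):

    "Send_approval_email": {
        "type": "ApiConnectionWebhook",
        "inputs": { "...": "..." },
        "limit": {
            "timeout": "@triggerBody()?['TimeOutValue']"
        }
    }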


That takes care of the timeout value passed to the HTTP request that initiates the Logic App, but what about returning the email approval status back to the ADF pipeline? It turns out that is pretty straightforward as well, though it has not been well documented as far as I could discover.

You can see in the following screenshot that the Logic App executes an HTTP POST action after the Send approval email activity. In the POST, all that is required is the callBackUri value, @{triggerBody().callBackUri}, as well as the “Output” values you wish to pass back to the calling pipeline. In this case, I wanted to receive the ApprovalStatus, which equated to the option the approver selected in the approval email: Approve, Reject or Null (for no selection or timeout). There are other values I was interested in, like a final status, but for the sake of this example, I will stick with the ApprovalStatus value.
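The Body of that HTTP POST ends up being a small JSON document along these lines. It is only a sketch: the Send_approval_email action name and its SelectedOption output property are my assumptions about the approval connector, so verify them in your own app.

    URI:  @{triggerBody().callBackUri}
    Body:
    {
        "ApprovalStatus": "@{body('Send_approval_email')?['SelectedOption']}"
    }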


I would like to note at this point that it was not apparent at first how to add the output values in the Body of the HTTP POST. I had to again turn to Code view and got lucky by placing a similarly formatted JSON block in the Body section. After I did, lo and behold, the designer displayed the expected and validated values. The final step to make this an end-to-end success was to test the values as they were returned to the pipeline. This leads into the third challenge: determining if the approval emails were rejected.

Dealing with rejection

To verify that I was receiving the approval status from the Logic App back to the pipeline as described above, I checked the value of the output of the Email_First_Approver_1 Webhook activity after manually rejecting a test email sent to my account. You can see in the screenshot below from a complete pipeline run that indeed I had received back a Reject ApprovalStatus value.


From here, it was an easy matter of using this value in another If Condition activity for a rejected approval. As the workflow logic dictated, if either approver rejected the email, then the entire pipeline should stop processing at that point. The evaluation expression in the If Condition was a simple function:

@equals(activity('Email_First_Approver_1').output.ApprovalStatus,'Reject')


This is where things got a little interesting and messy. You can see in the above screenshot that there are four activities in the True container which would be executed if the ApprovalStatus equated to Reject. Three of those activities were responsible for setting a variable, sending a final notification that the initial email was rejected and writing the status out to a SQL logging table, which I will cover in the next section. I would refer to the fourth activity as a hack or a workaround to overcome one of several limitations of the current version of ADF, and that is the inability to terminate execution of a pipeline based on an evaluation. It should be a simple matter of terminating a pipeline run if the email is rejected. In Logic Apps, by comparison, there is a Terminate control.


This concept does not, to my knowledge, exist in ADF. There are a couple of Azure feedback items that address this.

https://feedback.azure.com/forums/270578-data-factory/suggestions/38143873-a-new-activity-for-cancelling-the-pipeline-executi

https://feedback.azure.com/forums/270578-data-factory/suggestions/34301983-throw-error-activity

I suppose it is possible to self-cancel the pipeline run based on a value, but I did not explore that option. I will provide a link here if you would like to explore dynamically creating a URL string and POST the cancelling of your pipeline instead of the solution I chose.

https://docs.microsoft.com/en-us/rest/api/datafactory/pipelineruns/cancel
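If you do explore that option, the call is a management-plane POST against the run's cancel endpoint, with the run ID built dynamically. The sketch below uses placeholders for the subscription, resource group and factory names, and a Web activity issuing it would still need to authenticate (for example with the factory's managed identity) against https://management.azure.com:

    Method: POST
    URL:    @concat('https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group>',
                    '/providers/Microsoft.DataFactory/factories/<factory-name>/pipelineruns/',
                    pipeline().RunId, '/cancel?api-version=2018-06-01')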

The path I chose was actually mentioned in one of the feedback links, which is to intentionally generate an error in the pipeline to force it to stop. Prior to the termination, it is possible to write out a log entry noting the intentional stoppage. The below stored procedure, not my proudest achievement in SQL development history, serves the purpose of killing my pipeline upon rejection and can be called with an Execute Stored Procedure activity right after the log entry is written.

CREATE PROCEDURE [dbo].[spThrowError]
AS
BEGIN
    RAISERROR('Forcing Pipeline To Stop', 15, 1)
END
GO


Logging it all, and you are kidding me

Up to this point, I have demonstrated only a few of the overall pipeline activities for the approval process. In the final solution there are many more control tasks that send reminders, move files from source to sink locations, and as mentioned, write out log entries to a SQL table at almost every step. I created another simple stored procedure that performed this logging and placed it amply throughout the pipeline so that analysts could use the information to streamline the process in the future. It would be helpful to know, for example, how often the emails were being rejected or the duration of time between when the emails were sent and when they were approved. In this case, time really was money.

Below is the code for the stored procedure

CREATE PROCEDURE [dbo].[spInsertPipelineExecution]
(
    @PipelineId uniqueidentifier
  , @PipelineStatus nvarchar(600)
  , @PipelineStartDate datetime2
  , @PipelineEndDate datetime2
  , @TriggerName nvarchar(50)
  , @ActivityName nvarchar(1000)
  , @ApproverEmail nvarchar(1000)
  , @EmailStatus nvarchar(1000)
  , @FileName nvarchar(1000)
)
AS
BEGIN

    DECLARE @clean_PipelineRunId UNIQUEIDENTIFIER =
        CAST(REPLACE(REPLACE(@PipelineId, '}', ''), '{', '') AS UNIQUEIDENTIFIER);

    INSERT INTO [dbo].[RateSheetApprovalPipeline]
    (
        [PipelineID]
      , [PipelineStatus]
      , [PipelineStartDate]
      , [PipelineEndDate]
      , [TriggerName]
      , [ActivityName]
      , [ApproverEmail]
      , [EmailStatus]
      , [FileName]
    )
    VALUES
    (
        @clean_PipelineRunId
      , @PipelineStatus
      , @PipelineStartDate
      , @PipelineEndDate
      , @TriggerName
      , @ActivityName
      , @ApproverEmail
      , @EmailStatus
      , @FileName
    )

END

The values passed to the stored procedure would, in almost all cases, be dynamic such as is the case for PipelineID, PipelineStartDate and PipelineEndDate (which truth be told is a misnomer as this equates to an activity end date and time predominantly). At any rate, you can see the passing of some of these values at various stages of execution below.
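To give a feel for that wiring, here is a sketch of how the stored procedure parameters can be fed from system variables and activity outputs; the exact mapping naturally varies per activity, and the last line assumes the Webhook activity name used earlier in this article:

    @PipelineId         ->  @pipeline().RunId
    @PipelineStartDate  ->  @pipeline().TriggerTime
    @PipelineEndDate    ->  @utcnow()
    @TriggerName        ->  @pipeline().TriggerName
    @EmailStatus        ->  @activity('Email_First_Approver_1').output.ApprovalStatus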


Below is a representation of the rows for different pipeline runs. The timeout value of PT2M was set for my testing purposes; I did not want to wait around for 20-plus minutes for the emails to time out while debugging.


So this brings me to the unexpected surprise, which really should not have been a surprise after all my years working within the constraints of the technologies I implement. I had initially wanted to do all of the logic for the workflow and logging steps in one big pipeline. It was not until I was finishing up the last few tasks that I hit the 40-activity limit per pipeline. I do not have a screenshot to share on this one, but trust me, if you try to add more than 40, you will not be able to publish the pipeline. You can find other limitations for Data Factory at the following link.

https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits

Needless to say, I had to take a more modular approach and to be honest, that was probably for the best. To build a modular pipeline design does bring its own challenges, however, which I will be happy to delve into in the future.

Conclusion

For now, I am happy to pass along a few of the gaps I had to fill in with ADF. I am sure there are better (and worse) ways of doing some of the things I have demonstrated here. I am also sure that Microsoft will review the feedback and at some point in the future will address the concerns raised here as well as others. As I said at the outset, knowing the benefits and limitations upfront can save a lot of time and frustration. I hope that I have been able to shed a little more light on what you can do if you are tasked with building a fairly complex workflow pipeline in ADF and hit upon similar challenges.

Source: SQL – Simple Talk

Memory Consumption in Azure Functions

September 16, 2020   Microsoft Dynamics CRM

The story behind this blogpost started from an exception we got when testing an interface – System.OutOfMemoryException. The goal of the interface was to take a ZIP file as an input, read the data from the archived files, and import customer payment journals into Dynamics 365 for Finance. It was using an Azure Function for unzipping an archive on Azure File Storage. Obviously…

Source: PowerObjects – Bringing Focus to Dynamics CRM

Monitoring the Power Platform: Azure DevOps – Orchestrating Deployments and Automating Release Notes

August 31, 2020   Microsoft Dynamics CRM

Summary

 

DevOps has become more and more ingrained into our Power Platform project lifecycle: work item tracking and feedback tools for teamwork, continuous integration and delivery for code changes and solution deployments, and automated testing for assurance, compliance and governance considerations. Microsoft’s tool, Azure DevOps, provides native capabilities to plan, work, collaborate and deliver. Each step along the way in our Power Platform DevOps journey can be tracked and monitored, which will be the primary objective of this article.

In this article, we will focus on integrating Azure DevOps with Microsoft Teams to help coordinate and collaborate during a deployment. We will explore the various bots and how to set them up. From there we will walk through a sample scenario involving multiple teams working together. Finally, we will look to automate release notes using web hooks and Azure Function.

Sources

 

Sources of Azure DevOps events that impact our delivery can come from virtually any area of the platform, including work items, pipelines, source control, testing and artifact delivery. For each of these events, such as completed work items, we can set up visualizations such as charts based on defined queries. Service hooks and notification subscriptions can be configured to allow real-time reporting of events to external parties and systems, allowing us to stay in a state of continuous communication and collaboration.



Microsoft Teams, Continuous Collaboration and Integration

 

The Azure DevOps bots for Microsoft Teams have quickly grown into one of my favorite features. For instance, Azure DevOps dashboards and kanban boards can be added to channels for visualizations of progress as shown below.


Multiple Azure DevOps bots can be configured to deliver messages to and from Microsoft Teams to allow for continuous collaboration across multiple teams and channels. These bots can work with Azure Pipelines, work items and code pull requests.



For monitoring and orchestrating deployments across our various teams, the Azure Pipelines bot is essential. Let’s begin by setting up subscriptions to monitor a release pipeline.

NOTE: The rest of this document will be using a release pipeline as an example, but this will also work with multi-stage build pipelines that utilize environments.

Configuring the Azure Pipelines Bot in Microsoft Teams

 

Use the “subscriptions” keyword with the Azure Pipelines bot to review and modify existing subscriptions and add new ones.
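For reference, the commands typed at the bot in a Teams channel look roughly like the following; the organization, project and definition values are placeholders, so check the bot's help output for the exact syntax in your tenant:

    @azure pipelines subscribe https://dev.azure.com/<organization>/<project>/_release?definitionId=<id>
    @azure pipelines subscriptions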


In the above example, we are subscribing to any changes in stages or approvals for a specific release pipeline. It is recommended to filter to a specific pipeline to reduce clutter in our Teams messaging. The Azure Pipelines bot, using actions described in the article “Azure DevOps – Notifications and Service Hooks”, can be further filtered by build statuses. This is helpful to isolate the messages delivered to a specific Teams channel.


Once configured, as soon as our pipeline begins to run, Microsoft Teams will begin to receive messages. Below is an example showing the deployment of a specific release including stages and approval requests. What I find nice about this is that Microsoft Teams works on both my mobile devices and even Linux based operating systems, allowing any team on any workload to utilize this approach.


I also want to point out that Azure DevOps can natively integrate with other third-party tools such as Slack (similar to the Teams bots), ServiceNow and Jenkins.

Release Pipelines

 

Quality Deployments

 

Deployments within a release pipeline allow for numerous ways to integrate monitoring into Azure DevOps processes. Each deployment includes pre- and post-conditions, which can be leveraged to send events and metrics. For instance, the Azure Function gate can be used to invoke a microservice that writes to Azure Application Insights, creates ServiceNow tickets or even produces Kafka events. The possibilities are endless; imagine sending messages back to the Power Platform for each stage of a deployment!

Approvals

 

Pre and Post approvals can be added to each job in the release pipeline. Adding these can assist during a complex deployment requiring coordination between multiple teams dispersed geographically. Shown below is a hypothetical setup of multiple teams each with specific deliverables as part of a release.


In this scenario, a core solution needs to be deployed and installed before work on dependent features can begin. When any of the steps in the delivery process begins, the originating team needs to be notified in case any issues come up.

Using approvals allows the lead of the specific feature team to align the resources and communicate to the broader team that the process can move forward. The full example can be found below.


Here is an example of an approval within Microsoft Teams, notifying the lead of the core solution team that the import process is ready. The approval request shows the build artifacts (e.g. solutions, code files, etc), the branch and pipeline information.


Deployment Gates

 

At the heart of a gated deployment approach is the ability to search for inconsistencies or negative signals to minimize unwanted impact further in the process. These gates, which can be set to run before or after a deployment job, allow us to query for potential issues and alerts. They also could be used to notify or perform an operation on an external system.

[Image: deployment gate options in a release pipeline]

Queries and Alerts

 

Deployment gates provide the ability to run queries on work items within your Azure DevOps project. For instance, this allows release coordinators and deployment managers to check for bugs reported from automated testing using RSAT for Dynamics 365 F&O or EasyRepro for Dynamics 365 CE. These queries are created within the Work Items area of Azure DevOps. From there, they are referenced within the pipeline, and upper and lower thresholds can be set based on the data returned. If these thresholds are crossed, the gate condition fails and the process halts until corrections are made.
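As a rough illustration of what such a gate evaluates, the sketch below runs a WIQL query against the Work Item Tracking REST API and compares the number of open bugs to an upper threshold. The organization, project, personal access token, and the 'automated-test' tag are placeholders, not values from the original article.

```python
import base64

import requests

ORG_URL = "https://dev.azure.com/your-org"   # placeholder organization
PROJECT = "YourProject"                      # placeholder project
PAT = "<personal-access-token>"              # placeholder credential

# WIQL equivalent of a shared work item query: active bugs raised by
# automated RSAT / EasyRepro runs (the tag name is a placeholder).
wiql = {
    "query": (
        "SELECT [System.Id] FROM WorkItems "
        "WHERE [System.TeamProject] = @project "
        "AND [System.WorkItemType] = 'Bug' "
        "AND [System.State] <> 'Closed' "
        "AND [System.Tags] CONTAINS 'automated-test'"
    )
}

auth = base64.b64encode(f":{PAT}".encode()).decode()
response = requests.post(
    f"{ORG_URL}/{PROJECT}/_apis/wit/wiql?api-version=6.0",
    json=wiql,
    headers={"Authorization": f"Basic {auth}"},
)
response.raise_for_status()

open_bugs = len(response.json().get("workItems", []))
UPPER_THRESHOLD = 0  # mirrors the gate's upper threshold setting
print(f"Open automated-test bugs: {open_bugs}")
if open_bugs > UPPER_THRESHOLD:
    raise SystemExit("Gate condition not met: open bug count exceeds threshold")
```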

External Integrations

 

As mentioned above, the Azure Function gate is natively integrated within deployment gates for release pipelines. It can be used as both a pre-deployment and post-deployment condition to report to, or integrate with, external systems.

Deployment gates can also invoke REST API endpoints. Within the Power Platform, this could be used to query the Common Data Service (CDS) API or run Power Automate flows. Examples include querying the Common Data Service for running asynchronous jobs, creating activities within a Dynamics 365 environment, or performing admin actions such as enabling administration mode. Another option is to use the robust approval process built into Power Automate for pre- and post-deployment approvals outside of the Azure DevOps licensed user base.
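For example, a gate calling a REST endpoint could check whether asynchronous jobs are still running before allowing a deployment to proceed. The sketch below queries the Common Data Service (Dataverse) Web API; the environment URL and bearer token are placeholders, and in practice the token would be acquired with a service principal.

```python
import requests

ENVIRONMENT_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment
ACCESS_TOKEN = "<bearer-token>"                        # placeholder OAuth token

# Count system jobs that have not yet completed (statecode 3 = Completed
# for asyncoperation records), so a gate can hold the release until they drain.
response = requests.get(
    f"{ENVIRONMENT_URL}/api/data/v9.1/asyncoperations",
    params={
        "$select": "asyncoperationid",
        "$filter": "statecode ne 3",
        "$count": "true",
    },
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json",
    },
)
response.raise_for_status()

pending_jobs = response.json().get("@odata.count", 0)
print(f"Asynchronous jobs still pending: {pending_jobs}")
```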

Using Build Pipelines or Release Pipelines

 

In the previous section, I described how to introduce quality gates to a release, securing each stage of the pipeline. Release pipelines are useful for controlling and coordinating deployments. That said, environments and build pipelines allow for YAML templates, which are flexible across both Azure DevOps and GitHub and allow teams to treat pipelines like other source code.

Environments

 

Environments in Azure DevOps allow for targeted deployment of artifacts to a collection of resources. In the case of the Power Platform, this can be thought of as a release to a Power Platform environment. The use of pipeline environments is optional, unless you begin working with release pipelines, which do require environments. Two of the main advantages of environments are deployment history and security and permissions.

Environment Security Checks

 

Environment security checks, as mentioned above, can provide quality gates similar to the current capabilities of release pipelines. Below is an example of the current options, compared to the release pipeline pre- and post-deployment quality gates.

[Image: environment security check options]

Here is an example of linking to a template in GitHub.

[Image: environment check referencing a YAML template hosted in GitHub]

Compare this to the Release Pipeline Pre or Post Deployment Quality Gates.

[Image: release pipeline pre- and post-deployment quality gate options]

Scenario: Orchestrating a Release

 

[Image: multi-stage release pipeline orchestrated with Microsoft Teams notifications and approvals]

In the above example, we have a multi-stage release pipeline that encompasses multiple teams from development to support to testing. The pipeline relies on multiple artifacts and code branches for importing and testing.

In this example, we have a core solution containing Dynamics 365 entity changes that are needed by integrations. The core team will need to lead the deployment and testing, then notify the subsequent teams that everything has passed and they can move on.

Below is an example of coordination between the deployment team and the Core team lead.

[Image: Microsoft Teams conversation coordinating the core solution deployment]

Below is an image showing the entire release deployment with stages completed.

[Image: completed release deployment with all stages]

Automating Release Notes

 

Azure Application Insights Release Annotations

 

The Azure Application Insights Release Annotations task is a marketplace extension from Microsoft that allows a release pipeline to signal an event, such as the start of the pipeline, the end, or any other event we are interested in. From there, we can use the native functionality of Azure Application Insights to stream metrics and logs alongside the annotation.
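For reference, the sketch below shows one way a script could create a release annotation directly against the Azure Monitor management API, which is roughly what the marketplace task does on your behalf. The subscription, resource group, component name, release name, and the ARM bearer token are all placeholders.

```python
import datetime
import json
import uuid

import requests

SUBSCRIPTION = "<subscription-id>"        # placeholder
RESOURCE_GROUP = "<resource-group>"       # placeholder
APP_INSIGHTS_NAME = "<component-name>"    # placeholder
MGMT_TOKEN = "<management-plane-token>"   # placeholder ARM bearer token

resource_id = (
    f"/subscriptions/{SUBSCRIPTION}/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/microsoft.insights/components/{APP_INSIGHTS_NAME}"
)

# Annotation body modeled on the documented release annotation schema;
# Properties is a JSON string of arbitrary key/value pairs.
annotation = {
    "Id": str(uuid.uuid4()),
    "AnnotationName": "Release-42",  # placeholder release name
    "EventTime": datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ"),
    "Category": "Deployment",
    "Properties": json.dumps({"ReleaseName": "Release-42", "TriggeredBy": "Azure DevOps"}),
}

response = requests.put(
    f"https://management.azure.com{resource_id}/Annotations?api-version=2015-05-01",
    json=annotation,
    headers={"Authorization": f"Bearer {MGMT_TOKEN}"},
)
response.raise_for_status()
print("Release annotation created")
```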

Using an Azure Function with Web Hooks

 

Service hooks are a great way of staying informed about events happening within Azure DevOps, freeing you up to focus on other things. Examples include pushing notifications to your team's mobile devices, notifying team members on Microsoft Teams, or even invoking Microsoft Power Automate flows.

[Image: Azure DevOps service hook invoking an Azure Function]

The sample code for generating Azure DevOps release notes using an Azure Function can be found here.
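As a starting point, here is a minimal sketch (not the linked sample) of a Python Azure Function that a service hook could call for release events. The payload fields assume a "release created" subscription; inspect a real payload from your own subscription before relying on them.

```python
import logging

import azure.functions as func

app = func.FunctionApp()

@app.route(route="release-notes", auth_level=func.AuthLevel.FUNCTION)
def release_notes(req: func.HttpRequest) -> func.HttpResponse:
    event = req.get_json()

    # Field names below assume a "release created" service hook payload.
    release = event.get("resource", {}).get("release", {})
    name = release.get("name", "unknown release")
    artifacts = [a.get("alias", "?") for a in release.get("artifacts", [])]

    notes = f"## {name}\n\nArtifacts: {', '.join(artifacts) or 'none'}\n"
    logging.info("Generated release notes:\n%s", notes)

    # A fuller version could commit the notes to a wiki page or post them
    # to a Teams channel; here they are simply returned to the caller.
    return func.HttpResponse(notes, mimetype="text/markdown")
```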

Next Steps

 

In this article, we have worked with Azure DevOps and Microsoft Teams to show a scenario for collaborating on a deployment. Using the SDK or REST API, Azure DevOps can be explored in detail, allowing us to reimagine how we consume and work with the service. This helps with automating release notes and inviting feedback from stakeholders.

Previously we looked at setting up notifications and web hooks to popular services. We then reviewed the Azure DevOps REST API to better understand build pipelines and environments.

If you are interested in learning more about specialized guidance and training for monitoring or other areas of the Power Platform, which includes a monitoring workshop, please contact your Technical Account Manager or Microsoft representative for further details.

Your feedback is extremely valuable, so please leave a comment below and I'll be happy to help where I can! Also, if you find any inconsistencies or omissions, or have suggestions, please go here to submit a new issue.

Index

 

Monitoring the Power Platform: Introduction and Index


Dynamics 365 Customer Engagement in the Field
