Relevance Search – Additional Filtering Using Facets and Filters

Relevance Search returns results in a single list, sorted by relevance based on a scoring concept. One key thing to know is that the higher the score, the more relevant the item.

Relevance Search can:

• Find matches to any word in the search phrase. Matches include various forms of the search word; for example, “service” will match “servicing” or “serviced”

• Search for text in emails and notes

• Search records that you own as well as those that have been shared with you

• Search for text in Option Set and Lookup fields

• Search for text in SharePoint integrated documents (scheduled to be included in the next Dynamics 365 update)

• Search for text within documents in Dynamics 365, including documents attached to Notes, Emails, and Appointments

As you can see, Relevance Search can do many great things, but it can also return millions of matches depending on the size of your organization. Luckily for us, Microsoft has thought about that and included a feature called Facets and Filters. We get additional filtering by Record Type, Owner, Modified Date, and Created Date to personalize the search experience.

Additional Filtering using Facets and Filters

Global Facets: You can refine your search results by Record Type, Owner, Created On, or Modified On. In the example below, I filtered the search results to show only records for a specific “Owner.”

Entity Specific Facets: When you click on a specific record type, additional facets appear. These facets are specific to the fields of the Record Type/Entity. System Administrators and System Customizers can configure which fields are available for faceting through the entity’s Quick Find view. In the example, clicking on Cases gave me two additional facets: Priority and Origin.

End user configuration: End users can also personalize their search experience by configuring the facet fields that they would like to see for any searchable entity.

Now that you have learned how to narrow your search results, you will get results that are more relevant to your needs, making it easier to find what you are looking for.

Keep up with the latest and greatest on Dynamics 365 by subscribing to our blog here!

Happy Dynamics 365’ing!

PowerObjects- Bringing Focus to Dynamics CRM

Data Management and Integration Using Data Entities – Part 2

In part two of our Dynamics 365 for Finance and Operations: Data Management and Integration series, we will cover detailed information on data management and integration using OData Services.

This type of integration is real time in nature and is mainly used in scenarios where business requirements center on Office integration and third-party mobile app integration. OData stands for Open Data Protocol, an industry-standard Representational State Transfer (REST) based protocol for performing CRUD operations (Create, Read, Update, Delete) and integrating with Dynamics 365 for Finance and Operations.
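
The CRUD-to-HTTP mapping that OData uses can be illustrated with a small helper. This is a sketch only: the base URL and entity set names are placeholders, not a specific environment, and real calls would also carry an OAuth bearer token.

```python
# Sketch: how OData's CRUD operations map onto HTTP verbs against a
# Finance and Operations endpoint. BASE_URL is a placeholder.
BASE_URL = "https://yourorg.operations.dynamics.com/data"

CRUD_TO_HTTP = {
    "create": "POST",    # POST   /data/EntitySet
    "read":   "GET",     # GET    /data/EntitySet or /data/EntitySet(key)
    "update": "PATCH",   # PATCH  /data/EntitySet(key)
    "delete": "DELETE",  # DELETE /data/EntitySet(key)
}

def odata_request(operation, entity_set, key=None):
    """Return the (HTTP method, URL) pair for a CRUD operation."""
    url = f"{BASE_URL}/{entity_set}"
    if key is not None:
        url += f"({key})"  # OData addresses a single record by its key
    return CRUD_TO_HTTP[operation], url
```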

OData uses REST application programming interfaces (APIs) and the OAuth 2.0 authorization mechanism to exchange data between integrating systems and Finance and Operations.

With OData Services for Finance and Operations, you can seamlessly integrate with all types of web technologies, such as HTTP and JavaScript Object Notation (JSON), and developers can interact with data in a standard yet powerful manner using RESTful web services.

OData endpoint

Data Entities that are marked Yes for the IsPublic property are exposed as OData endpoints. When the IsPublic property for an updatable view is set to TRUE, that view is exposed as a top-level OData entity. Developers can consume these OData endpoints in external applications, such as a .NET application, for integration scenarios.
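
A consumer typically appends OData query options to the endpoint URL. The helper below only builds such a URL (a sketch; the entity set name used in the test is illustrative, and real query values would need URL encoding):

```python
def odata_query_url(base_url, entity_set, select=None, filter_expr=None, top=None):
    """Build an OData query URL for a public data entity (sketch)."""
    options = []
    if select:
        options.append("$select=" + ",".join(select))   # project columns
    if filter_expr:
        options.append("$filter=" + filter_expr)        # server-side filter
    if top is not None:
        options.append(f"$top={top}")                   # limit result count
    url = f"{base_url}/data/{entity_set}"
    if options:
        url += "?" + "&".join(options)
    return url
```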

Integrating Client Application with OData:

The OData integration REST API uses the same OAuth 2.0 authentication model as the other service endpoints. Before an integrating client application can consume this endpoint, developers must create and register the application in Microsoft Azure Active Directory (AAD) and give it the appropriate permissions to Finance and Operations, per the steps below:

Go to Azure Portal > Azure Active Directory > App Registrations

Click New Application Registration and select “Web app/API” for the application type. Enter your Dynamics 365 URL for sign-on.

Click Create and make sure to note the application ID. Click on the app and go to the “Required Permissions” page.

Click Add and select “Microsoft Dynamics AX.” Go to the Select Permissions tab and select all the permissions available.
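
With the app registered, the client obtains a token before calling the OData endpoint. The sketch below only constructs the client-credentials request an integrating application would POST to the AAD token endpoint; the tenant, application ID, and secret are placeholders.

```python
def build_token_request(tenant_id, client_id, client_secret, resource_url):
    """OAuth 2.0 client-credentials request body for Azure AD (sketch).
    resource_url is the Finance and Operations URL the registered app
    was granted permission to; client_id is the application ID noted
    at registration."""
    token_endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource_url,
    }
    return token_endpoint, body
```

The returned access token is then sent as a bearer token on each OData call.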

Once the application is created and registered, the following activities are performed in Dynamics 365 for Finance and Operations. A data project using the Data Management Framework can be used to create import/export jobs that load data into data entities or extract data from them.

Click the Dynamics 365 URL > Go to System Administration > Data Management > Click on Import

In our next part of this series, we’ll look at Asynchronous Integrations. Stay tuned and subscribe to our blog to receive the latest posts in your inbox!

Happy Dynamics 365’ing!

Data Management & Integration Using Data Entities – Part 1

Today we will review data management and integration using Data Entities in the Dynamics 365 for Finance and Operations application.

Modern enterprise organizations have interconnected devices, systems, and processes that collaborate with each other and with users across business functions. This helps provide an exceptional user experience and supplies the data and information needed to ensure the correct business decisions are made while aiding productivity. That is where integration plays a vital role.

Dynamics 365 for Finance and Operations offers both real-time (synchronous) and batch-driven (asynchronous) data management and integration options, depending upon user requirements. Most commonly, Data Entities are used for data management and integration. A data entity provides conceptual abstraction and encapsulation of underlying table schemas to represent key data concepts and functionalities.

These Data Entities are grouped into five types that are based on their functions and the type of data that they provide.

• Parameter: prerequisite setup data specific to an industry or business, such as functional and deployment parameters for modules. Consists of tables that contain only one record, where the columns are values for settings. Examples: AP/AR/GL/Inventory setups.
• Reference: setup reference data, specific to an industry or business process, in small quantities that enables transactions. Examples: units, dimensions, and tax groups/codes.
• Master: large amounts of data required for business operations and critical for transactions. Examples: customers, vendors.
• Document: business operational data with complex structures, including header and line items. Examples: sales orders, purchase orders, journals.
• Transaction: operational transaction data, such as posted transactions, which are considered records. Example: pending invoices.

How to build a data entity in Dynamics 365 for Finance & Operations

Option 1: building an entity by using a wizard

The easiest way to build an entity is to start the wizard by adding a new item of type Data entity to the project. When building an entity, start with a root data source. You can then add additional data sources, either manually or by selecting a surrogate foreign key field in the root data source to automatically expand the required data sources. When the wizard completes, it creates the following:

  • Data entity
  • Staging table (optional, if data management was enabled)

Here is a step-by-step illustration:

Option 2: Building an entity from a table

Developers can quickly create an entity from a table, and then customize the properties, data sources, and fields later.

Right-click the table, and then select Addins > Create data entity.

Output:

In previous versions of Microsoft Dynamics AX, there were multiple ways to manage data, such as Microsoft Excel Add-ins, AIF, and DIXF. The concept of data entities merges these different methods into one.

Here is a summary of the Data Entity Management Integration Scenarios:

• Synchronous integration using OData: Office integration and third-party mobile app integration.
• Asynchronous integration: interactive file-based import/export, and recurring integrations (file, queue, and so on).
• Business intelligence: aggregate data and standardized key performance indicators (KPIs).
• Application lifecycle management: data migration using packages (exporting and importing data packages).
• Dynamics 365 data integration using Common Data Service: integration and flow of data between various Microsoft Dynamics 365 products.

Stay tuned for part 2 of this series, where we will look into each of the data management and integration options in detail. Don’t miss out, be sure to subscribe to our blog!

Happy Dynamics 365’ing!

D365 In Focus: The Top Reasons Why You Should Be Using ServSmart [VIDEO]

The connected world provides an invaluable opportunity for organizations to generate data that can be monitored, analyzed, and acted upon, and there’s simply no denying the positive impact it can have on the field service industry. In today’s episode of Dynamics 365 In Focus, our Field Service Practice Director, Dan Cefaratti, describes how PowerObjects’ ServSmart framework is quickly becoming a must-have for manufacturers and aftermarket service providers looking to improve data visibility while reducing costs and providing an unmatched customer experience.

Customers Demonstrate Amazing Achievements Using API Management

Sportradar’s API Management Winning Streak

The official provider of data for the NFL, NBA, NHL, International Tennis Federation, and NASCAR, Sportradar US is recognized as the fastest growing sports media provider in North America. With the industry’s most proficient software, it distributes easy-to-consume content and data while setting new standards for speed and accuracy. 

“We initially started using TIBCO Mashery® because it was easy,” says SVP of Technology and Operations Dave Abbott. “We had built API portals in previous lives and understood the challenges and investment. At my last startup, it took a couple million dollars, six people, and six months. For $181.50, we built a free-trial functional prototype with Mashery in about three and a half days.

“About a year ago, we ran another evaluation. Mashery held up, and we stuck with it. Reporting is also very important, and from Mashery, it’s adequate on the method level, the service and package levels, and for general traffic.”

Read more about Sportradar’s experience with Mashery, its data feeds, developer portal, store, and user experience.

Hotelbeds Group Puts API Management Problems to Rest

Operating mainly under the Hotelbeds and Bedsonline brands, Hotelbeds Group connects 35,000 travel intermediaries in more than 120 markets to travel providers in over 180 countries who represent more than 120,000 hotels, 21,000 transfer routes, and 12,000 activities.

“The dynamics in our industry are affected by agreements and partnerships that can bring unexpected changes to the market, which is already influenced by world events,” explains Bruno Rodriguez, API evangelist. “Our first goal was quota control. Customers were sending a massive volume of requests to our system and overloading our infrastructure. Because we couldn’t handle the volume, customers were disconnected, so they couldn’t book anything. Obviously, we were losing money. In an increasingly competitive world, we really needed a cloud solution like TIBCO Mashery. We compared at least three or four API managers and made the decision. It was, for us, a way to stay on top of it.”

Learn how Hotelbeds got control of its traffic (at peak, around 6,000 data requests/second) and expanded business with clear metrics, a simple user experience, and stable, fast, reliable API management.

Join the TIBCO customer reference program to have your business transformation story shared globally with the technology industry, and trade and business press. Your story in print, web, and video format can boost your status as a thought leader and increase awareness with technology leaders, helping you raise your company visibility and attract and retain top talent. Email customermarketing@tibco.com today!

The TIBCO Blog

Using Version Number to Track Changes in Dynamics 365

Every entity in Dynamics 365, both out-of-the-box and custom, has a field named Version Number. It can’t be seen from the UI, as it is a system field, but it can be seen when querying the database or when fetching data using FetchXML. The system uses this field to track changes to a single record, and it can be leveraged during data migrations and integrations to check whether a record has changed at its source.
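
To make the field visible in a FetchXML result, it just has to be requested like any other attribute. Below is a small sketch that builds such a query; the entity and attribute names in the test are illustrative.

```python
import xml.etree.ElementTree as ET

def fetchxml_with_version(entity_name, attribute_names):
    """Build a FetchXML query that retrieves the given attributes plus
    the hidden versionnumber system field (sketch)."""
    fetch = ET.Element("fetch")
    entity = ET.SubElement(fetch, "entity", name=entity_name)
    for name in list(attribute_names) + ["versionnumber"]:
        ET.SubElement(entity, "attribute", name=name)
    return ET.tostring(fetch, encoding="unicode")
```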

What is Version Number?

At the database level, Version Number is a rowversion data type (sometimes called timestamp).

Some key details from TechNet:

Each database has a counter that is incremented for each insert or update operation that is performed on a table that contains a rowversion column within the database. This counter is the database rowversion. This tracks a relative time within a database, not an actual time that can be associated with a clock. A table can have only one rowversion column. Every time that a row with a rowversion column is modified or inserted, the incremented database rowversion value is inserted in the rowversion column.

So, to summarize: each time any field on a record is updated, its Version Number will change (tick up an increment).
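
The TechNet behaviour quoted above can be modelled in a few lines: one database-wide counter that ticks on every insert or update, with the new value stamped onto the affected row. This is a toy model for intuition, not how SQL Server is implemented.

```python
class RowversionTable:
    """Toy model of SQL Server's rowversion: a single database-wide
    counter that is incremented on every insert or update of a row in a
    table with a rowversion column."""

    def __init__(self):
        self._db_rowversion = 0  # the database's rowversion counter
        self.rows = {}

    def upsert(self, key, data):
        self._db_rowversion += 1  # every insert/update bumps the counter
        self.rows[key] = {"data": data, "versionnumber": self._db_rowversion}
        return self._db_rowversion
```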

How to track changes with Version Number

In the context of a one-way integration from Dynamics 365 to a staging database, version number can be leveraged using SSIS as follows.

From the Source to Staging Table

A dataflow might look like this:

Here, we stage records, including the Version Number (accounts in this example). From there we do a lookup in our staging table, matching the account to see if it exists and retrieving the version number stored in the staging DB. If there is no match, we create the record. If there is a match, we compare the version number from the source with staging. If the values differ, we delete the current record in staging and recreate it with what was queried from the source.
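
The decision logic of that dataflow, stripped of SSIS, looks like this (a sketch; `staging` is a plain dict standing in for the staging table, keyed by account id, and the field names are illustrative):

```python
def sync_account(source_row, staging):
    """One pass of the dataflow described above: create the record if it
    is missing from staging, delete and recreate it if the version
    numbers differ, otherwise leave it alone."""
    key = source_row["accountid"]
    staged = staging.get(key)
    if staged is None:
        staging[key] = dict(source_row)       # no match: create
        return "created"
    if staged["versionnumber"] != source_row["versionnumber"]:
        del staging[key]                      # match but changed:
        staging[key] = dict(source_row)       # delete and recreate
        return "recreated"
    return "unchanged"                        # match with same version
```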

One thing to note: when querying the data directly from on-premises SQL, the version number will be a binary data type and must be cast to a string to compare the values in the conditional split. However, if the data is queried using FetchXML, the value is returned as an integer; the functionality works the same way.
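
The two representations can be brought onto common ground before comparing. The article casts to string in SSIS; the sketch below normalizes to an integer instead, which is equivalent for equality checks.

```python
def normalize_versionnumber(value):
    """Return one comparable value regardless of source: on-prem SQL
    returns rowversion as 8 binary bytes, while FetchXML returns an
    integer (sketch)."""
    if isinstance(value, (bytes, bytearray)):
        return int.from_bytes(value, "big")  # rowversion is big-endian binary
    return int(value)
```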

Why you would use Version Number

There are two main reasons why version number might be a good choice for tracking these changes. First, because it works at the database level, the version number will change even when a record is updated using a direct SQL update (not recommended or supported, but we’ve seen it happen nonetheless). Other values, such as Modified On, are only updated at the application level. Using version number allows you to catch all changes, even unsupported ones.

The second reason is the simplicity of development. Most people here tend to think that development is simpler using this method, since it requires you to log less information in auditing (such as Modified On), allowing for more rapid and streamlined development.

Itching to learn more? Explore our blog!

Happy Dynamics 365’ing!

Digital Transformation using Dynamics CRM and SharePoint platforms

Create customer experiences that boost engagement and value to your business

The concept of “digital transformation” has a different meaning for different companies. Because digital transformation looks different for every company, I use the image of an old payphone booth converted into a WiFi hotspot to best describe one aspect of digital transformation for a telco. Digital transformation is the realization that customers will turn away from brands that don’t align with their needs and expectations, in this case Internet access anywhere. A top-notch customer experience is the way to keep customers involved and engaged with your brand.

The front-end customer experience is not the only focus of digital transformation. Using technology to redesign operational systems has a similar impact on a company’s ability to successfully provide great experiences to the customer. For this reason, we recommend harnessing the power of Dynamics 365 and SharePoint as the two main platforms to support digital transformation.

While most digital transformation experts support their ideas with business-to-consumer (B2C) examples, the transformation we are experiencing today is no different in the business-to-business (B2B) model. Business customers want to do everything faster, easier, simpler, and anywhere. They are no longer happy with getting information the traditional way; mailing brochures with price proposals no longer meets customers’ expectations.

When Microsoft CRM was first released, about 15 years ago, it was mainly used as an advanced spreadsheet to manage and update contact details of leads, contacts, and accounts. Since then, Dynamics 365 has been developed to manage all touchpoints between the business and its clients. In the B2B model, these touchpoints are mostly email communications and document exchanges between commercial businesses. Business documents are either generated within Dynamics 365 or imported from and synced with other ERP systems, and other business documents are expected to be stored in SharePoint. Theoretically, these two platforms are designed to hold all the business documents an organization generates, from price quotations, orders, invoices, payment receipts, and account statements to service agreements and legal and commercial contracts.

It may look like a static process between organizations. The reality is that documents are frequently exchanged and worked on: price quotations are negotiated, orders are modified, invoices need to be signed off and then paid, parties exchange contracts until all details are agreed upon, and artwork and creative ideas go through proofing processes before production.

DoxTray (by www.DynamicsObjects.com) is a cloud application scheduled for release by December 2017. While companies use SharePoint to collaborate internally between users, DoxTray aims to provide similar functionality for document collaboration between businesses. DoxTray relies on data stored in the Dynamics 365 database, as well as documents stored in SharePoint.

DoxTray gives clients the ability to upload documents such as orders and contracts, download and pay invoices, and exchange notes between two businesses regarding contracts, orders, quotes, and artwork/creative design proofing.

DoxTray is free, with paid add-on options.

If digital transformation is the vision of your business or your customers, we would like to hear from you.

CRM Software Blog | Dynamics 365

Build Cloud-native Applications Using TIBCO BusinessWorks Container Edition on AWS Marketplace

With Amazon Web Services (AWS), you can provision compute power, storage, and other resources, gaining access to a suite of elastic IT infrastructure services as your business demands them. Among many other benefits, one of the major factors why this has been appealing is the ability to control costs in an elastic manner while providing complete flexibility and agility to use infrastructure on-demand.

To further maximize the advantages of the AWS cloud computing delivery model, developers are turning to a cloud-native approach to building and running applications.

A cloud-native application is a program that is designed specifically for a cloud computing architecture. Such applications are designed to take advantage of cloud computing frameworks, which are composed of loosely coupled cloud services. This means that developers must break tasks down into separate services that can run on several servers in different locations.

This notion of granularity requires solution architects to think more about composite in lieu of monolithic design. In composite design, you bring together a collection of services to create a business application.

When it comes to implementing these composite applications, the need for “integration logic” is critical for success. For example, you may need to:

  • Route and orchestrate incoming API calls to coordinate a backend workflow that may require interaction between multiple backends
  • Simplify the protocol and format mapping issues encountered when interconnecting multiple services
  • Automatically compose outgoing messages by transforming and aggregating data coming from different backends
  • Use packaged adapters to third-party systems or services (such as mainframes) or packaged applications
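
The transform-and-aggregate bullet above can be made concrete with a tiny sketch: pull data from two hypothetical backends and shape one outgoing message. The backend schemas here are invented purely for illustration.

```python
def compose_outgoing_message(orders, customers, order_id):
    """Aggregate data from two hypothetical backends into one outgoing
    message (sketch of the 'integration logic' a tool like BWCE would
    otherwise handle)."""
    order = orders[order_id]
    customer = customers[order["customer_id"]]
    return {
        "orderId": order_id,
        # aggregation: total computed from the order lines
        "total": sum(line["qty"] * line["price"] for line in order["lines"]),
        # format mapping: internal customer record -> outgoing field
        "customerName": customer["name"],
    }
```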

While the developer can handle this integration logic manually, TIBCO BusinessWorks™ Container Edition was designed to hide the complexity of this integration from developers, allowing them to focus on the business logic of the application, and not the interconnecting of services.

TIBCO BusinessWorks Container Edition, available on the AWS Marketplace

TIBCO BusinessWorks Container Edition and plug-ins for AWS allow you to quickly and easily build cloud-native applications by connecting APIs, microservices, and backend systems. With its drag-and-drop graphical development environment, graphical data mapper, and vast library of connectors, you can create cloud-native integration applications and deploy them on AWS, leveraging native features of AWS Elastic Container Service or your choice of Docker-based PaaS built on AWS for container management.

For developers working within the AWS ecosystem, the fact that BusinessWorks Container Edition is available on the AWS Marketplace provides:

  • Quick and easy access to an industry-leading integration solution, designed specifically for building cloud-native applications
  • A consumption-based pricing model, where you pay only for the number of containers running per hour
  • The flexibility to scale on demand and manage software costs as you go

Deep integration with AWS Ecosystem

To simplify deploying BusinessWorks Container Edition on AWS, TIBCO leverages AWS CloudFormation to set up all the necessary resources, collectively known as a CloudFormation stack. This model also allows BusinessWorks Container Edition to integrate seamlessly with a variety of AWS services, such as EC2 Container Service (ECS), EC2 Container Registry (ECR), Application Load Balancer (ALB), and CloudWatch, to leverage their capabilities for container management, logging, auto-scaling, load balancing, service discovery, and much more. This also removes opportunities for manual error, increases efficiency, and ensures consistent configurations over time.
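
For orientation, launching any such stack boils down to a CloudFormation CreateStack call. The sketch below only assembles the request arguments (as you would pass them to boto3's CloudFormation client); the template URL and parameter names are placeholders, not TIBCO's actual template.

```python
def create_stack_request(stack_name, template_url, parameters):
    """Shape of a CloudFormation CreateStack request for launching a
    stack such as the BWCE one (sketch; placeholder template/params)."""
    return {
        "StackName": stack_name,
        "TemplateURL": template_url,
        "Parameters": [
            {"ParameterKey": k, "ParameterValue": v}
            for k, v in sorted(parameters.items())
        ],
        # stacks that create IAM roles (e.g. for ECS) need this capability
        "Capabilities": ["CAPABILITY_IAM"],
    }
```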

The capabilities provided as part of this integration include:

  • CloudFormation template to set up a highly available ECS cluster in an auto-scaling group. CloudFormation automates creation of all the resources required for this task, such as a VPC, public and private subnets across two AZs, an Internet Gateway, a NAT Gateway, EC2 instances, etc.
  • Ability to create a BusinessWorks Container Edition-based Docker image and push it to ECR
  • CloudFormation template to extend and customize the BusinessWorks Container Edition Docker image
  • Ability to download the CloudFormation templates and tailor them to suit your needs
  • AMI to create EC2 instances and set up your own container management platform using tools like Kubernetes and Docker Swarm

TIBCO BusinessWorks™ Container Edition and plug-ins for AWS hide the complexity of integrating APIs, microservices, and backend systems from developers, allowing them to focus on the business logic of their applications. You can now access BusinessWorks™ Container Edition on the AWS Marketplace with a consumption-based pricing model: you pay only for the software and AWS resources on an hourly basis, billed by AWS after usage.

You can learn more about BusinessWorks Container Edition and Plug-ins for AWS on the AWS Marketplace or by visiting our website at: https://www.tibco.com/products/tibco-businessworks

How to Set a User Lookup Field to the Current User using a Workflow Process in Microsoft Dynamics 365 – Part 2

In my previous article, I demonstrated how to configure a Workflow Process to set any User Lookup field to the Current User using the standard Modified By field to set a custom field named Last Updated By.

In this article, I provide additional important information that you should be aware of.

Real-Time Workflow Processes

When using a Real-Time Workflow Process, there is an option to configure the Workflow Process to automatically start either after or before a Contact has been updated.

It is important to ensure that the Workflow Process is configured to automatically start After, rather than Before, the designated fields on the Contact are changed. Otherwise, the Contact’s Modified By value used to set the Last Updated By field will contain the previous User who modified the Contact instead of the current User, and the Last Updated By field will be set incorrectly.

Here is what happens when the configuration for automatically starting the Workflow Process is set to Before the Contact is updated; i.e. Before Record Fields Change.

When creating a new Contact, the Last Updated On and Last Updated By fields are correctly set to the current Date/Time and User.

However, if another User now updates one of the designated fields on the Contact, the Last Updated By field is incorrectly set to the User who previously modified the Contact rather than to the current User.

(The original post includes screenshots comparing the incorrect value, the previous User, with the correct value, the current User.)
Background Workflow Processes

If using a Background Workflow Process, there is no option to configure the process to automatically start Before a Contact has been updated. A Background Workflow Process that is configured to automatically start when a Contact has been updated will always start After the Contact has been updated.

Real-Time vs Background Workflow Processes

In the example used in Part 1 of this article, I chose to use a Real-Time Workflow Process instead of a Background Workflow Process to ensure that the update of the Last Updated On and Last Updated By fields was immediate.

This has the following benefits:

  • The Workflow Process starts immediately. If a Background Workflow Process was used, there would be a short delay between when the designated fields on the Contact were updated and when the Workflow Process starts. Within this short time, it is possible for some other User or process to update other fields on the Contact, changing the Modified By field to that other User. Then, when the Workflow Process populates the Last Updated By field, it would be set to that other User rather than to the current User who updated the designated fields.
  • The Last Updated On and Last Updated By fields, if displayed on the form, are immediately updated and visible without needing to refresh the form. If a Background Workflow Process was used, the form would have to be refreshed before the changes to these fields would be displayed.

Other Scenarios

Sometimes you may encounter a scenario where you want to use a Workflow Process to set a User Lookup field on a record to the current User when there has been no corresponding creation or update of the record on which the User Lookup field is located, and therefore no corresponding update of the Modified By field.

This may be because:

  • The Workflow Process is started On-Demand, or manually, by a User without any corresponding changes being made to the record against which it was started.
  • The Workflow Process is automatically triggered by the creation or update of some related record such as an Activity, Case or Opportunity etc. instead of by the creation or update of the record on which the User Lookup field exists.
  • The Update step in the Workflow Process occurs after a Wait Until or Parallel Wait step; i.e. a Timeout period. Note: This scenario only applies to Background Workflow Processes.

In this situation, one of the following two approaches might be used to ensure that the User Lookup field is correctly set to the current User.

First Approach

Configure the Workflow Process with two successive Update steps as follows:

  • Configure the first Update step to change an unused field; this change also causes the Modified By field to be updated. Note: You must ensure that this step always changes the field. You can do this by selecting an unused Date/Time or Text field and configuring the Update step to set it to the Process Execution Time; e.g., for a Contact you could set the unused Pager field (a text field) to the Process Execution Time. Note: If you use an unused Date/Time field, its Format must be Date and Time rather than Date Only.


  • Configure the second Update step to set the Last Updated On and Last Updated By fields from the Process Execution Time and the Modified By field.


This approach works well if you don’t mind making two successive updates to the record.
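For readers who prefer to see the logic in code rather than in the workflow designer, the two successive Update steps can be sketched with the Dynamics 365 SDK. This is a minimal sketch under assumptions: the field names `new_lastupdatedon` and `new_lastupdatedby`, and the reuse of the Pager field (`pagerphone`), are illustrative and not taken from the article; your schema names will differ.

```csharp
using System;
using Microsoft.Xrm.Sdk;

public static class LastUpdatedSketch
{
    // Sketch of the First Approach: two successive updates against a Contact.
    public static void ApplyFirstApproach(
        IOrganizationService service, Guid contactId, Guid currentUserId)
    {
        // First Update step: touch an unused text field so the platform
        // stamps Modified On / Modified By with the current user and time.
        var touch = new Entity("contact", contactId);
        touch["pagerphone"] = DateTime.UtcNow.ToString("o"); // always a new value
        service.Update(touch);

        // Second Update step: copy the freshly stamped values into the
        // custom Last Updated On / Last Updated By fields.
        var stamp = new Entity("contact", contactId);
        stamp["new_lastupdatedon"] = DateTime.UtcNow;
        stamp["new_lastupdatedby"] = new EntityReference("systemuser", currentUserId);
        service.Update(stamp);
    }
}
```

In the declarative Workflow Process, the platform supplies the equivalents of `DateTime.UtcNow` and `currentUserId` through the Process Execution Time and Modified By dynamic values.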

Note: If you are using a Real-Time Workflow Process, configure it to run as the user who made changes to the record, rather than as the owner of the Workflow Process; otherwise the first Update step will set the Modified By field to the owner of the Workflow Process rather than to the current User. This option does not apply if the Workflow Process is started On Demand (manually) by a User, but it does apply if the Workflow Process is started automatically.


Hint: You could bundle these two Update steps into one single Action or Real-Time Child Workflow Process and then invoke that Action or Child Workflow Process from any other Workflow Process, Dialog Process or Action whenever required using a single Perform Action or Start Child Workflow step.

The following images show:

  • A Perform Action step in a Background Workflow Process that starts an Action named UpdateLastUpdated Fields. Note: Actions always run in Real-Time.


  • A Start Child Workflow step in a Background Workflow Process that starts a Real-Time Child Workflow Process named Update Last Updated Fields.


Second Approach

Configure the Workflow Process to first run a Custom Workflow Activity that returns the UserId of the User the Workflow Process is running as, and then use an Update step to set the Last Updated On and Last Updated By fields from the Process Execution Time and the returned UserId.

This approach:

  • Works only if the Workflow Process is running as the current User.
  • Requires the development and registration of a suitable Custom Workflow Activity that returns the UserId of the User the Workflow Process is running as.
  • Requires only one update, rather than two, to the record being updated.
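A minimal sketch of such a Custom Workflow Activity is shown below. The class name and output parameter name are illustrative assumptions; the SDK types (`CodeActivity`, `IWorkflowContext`, and the `[Output]`/`[ReferenceTarget]` attributes) are the standard ones from the Dynamics 365 workflow SDK.

```csharp
using System.Activities;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;

// Returns the User the workflow is running as, for use in a later Update step.
public class GetRunningUser : CodeActivity
{
    // Exposed in the workflow designer as an output parameter that can be
    // mapped onto a User Lookup field.
    [Output("Running User")]
    [ReferenceTarget("systemuser")]
    public OutArgument<EntityReference> RunningUser { get; set; }

    protected override void Execute(CodeActivityContext executionContext)
    {
        // The workflow context carries the Id of the user the process runs as.
        var context = executionContext.GetExtension<IWorkflowContext>();
        RunningUser.Set(executionContext,
            new EntityReference("systemuser", context.UserId));
    }
}
```

After registering the assembly, add this activity as the first step of the Workflow Process and use its Running User output as the dynamic value for the Last Updated By field in the subsequent Update step.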


Magnetism Solutions Dynamics CRM Blog

LinkedIn plans to teach all its engineers the basics of using AI


LinkedIn is working to train all of its engineers on the basics of implementing artificial intelligence as part of the company’s drive to make its professional social network smarter.

“The demand for AI across the company has increased enormously,” Deepak Agarwal, head of artificial intelligence at LinkedIn, said during an onstage interview at VB Summit 2017 today. “Everyone wants to have AI as a component of their product.”

The company has launched an AI academy for its engineers to give them a grounding in the basics of implementing artificial intelligence. The idea is that this will make it possible for them to deploy intelligent models in the company’s products.

LinkedIn has a great need for artificial intelligence, since its business is built on the back of recommendations. The professional social network recommends potential connections, jobs, content, marketing opportunities and a wide variety of other interactions.

“AI is like oxygen at LinkedIn, it permeates every single member experience,” Agarwal said. “And just to give you an idea of the scale, we process more than 2PB of data both nearline and offline every single day.”

The academy isn’t designed to give engineers an academic grounding in machine learning as a general discipline. Rather, it’s intended to prepare them to use AI much the way they’d use a system like QuickSort: an algorithm that sorts whatever data is fed into it. Users don’t have to understand how the underlying system works; they just need to know the right way to apply it.

That’s the goal for LinkedIn, Agarwal said. Thus far, six engineers have made it through the AI academy and are deploying machine learning models in production as a result of what they learned. The educational program still has a ways to go (Agarwal said he’d grade it about a C+ at the moment) but it has the potential to drastically affect LinkedIn’s business.

Agarwal’s comments echo — to a degree — those made earlier in the day by Gil Arditi, the product lead for Lyft’s AI platform team. In Arditi’s view, one of the most pressing problems facing the ride hailing company is a dearth of qualified AI talent.

As it stands, creating a machine learning system still requires a number of skilled practitioners who can help wrangle the data and tweak the parameters needed to optimize the system. Those employees are generally hard to come by and expensive to hire. Distributing AI knowledge across LinkedIn’s entire organization could help the company keep up with its demand for intelligent capabilities.


Big Data – VentureBeat