Monthly Archives: August 2017

Why Mainframes Matter in the Age of Cloud Computing

August 31, 2017   Big Data

Has cloud computing killed mainframes? You might think so. In fact, mainframes remain supremely important, even in the age of the cloud. Here’s why.

It’s easy enough to understand why one might think that mainframes are no longer relevant in the present age of “cloud-native” everything. Today, everyone and their mother is moving workloads to the cloud.

Public cloud computing platforms like AWS and Azure are generating record amounts of revenue, while private cloud frameworks, such as OpenStack, are more popular than ever.

You may think that, as a technology introduced more than a half-century ago, mainframes don’t have much to offer in a world dominated by much newer cloud computing technology.

Mainframes are the Original Cloud

But they do. The thing is, many of the qualities that make cloud computing so valuable to enterprises are also true of mainframes.

Think about it. Cloud computing has become popular because the cloud gives you:

  • On-demand access to compute and storage resources that can scale virtually without limit.
  • A more cost-efficient way to maintain infrastructure than relying on local commodity servers.
  • The flexibility to spin up different types of software environments. In the cloud, you can run any type of host operating system you want. You can use virtual servers, or you can use containers. You get a lot of flexibility.
  • A lower maintenance burden. When you move workloads to the cloud, you no longer have to maintain lots of individual on-premise servers.

To a large extent, mainframes offer precisely the same benefits. Consider the following points:

  • A mainframe provides a single infrastructure that delivers massive compute and storage resources whenever your applications require them.
  • If you use a mainframe, it’s probably one that you already own. You don’t have to purchase new infrastructure. In addition, a mainframe can last for decades, whereas a commodity server usually lasts just a few years. For both reasons, mainframes offer cost-efficiency.
  • Mainframes give you plenty of software flexibility. You can use a mainframe operating system like z/OS, or you can use Linux. You can even run Docker containers on your mainframe.
  • When you run applications on a mainframe, you only have one machine to maintain. This is simpler than maintaining dozens of on-premise commodity servers.

Mainframes have long offered these benefits, but the cloud computing concept as we know it today has been around only for about a decade.

Essentially, then, mainframes were the original cloud. Going back decades, mainframes allowed companies to build cloud infrastructure avant la lettre (which is a fancy way of saying “before the term cloud infrastructure existed”).

Mainframes and Cloud Computing Today

Don’t get me wrong. I’m not anti-cloud. You should move workloads to the cloud when the cloud is the best fit for a particular type of workload.

But if you have a mainframe environment set up, there’s a good chance that your mainframe can provide many of the same benefits as a cloud platform. In that case, you’ll save time, effort and money by continuing to use your mainframe to provide the same cloud-like benefits that it has been delivering for years.

In short, the cloud is not a replacement for your mainframe. It’s just a newer, flashier way of implementing the same types of functionality that mainframes offered starting decades ago.

Related: Expert Interview: Trevor Eddolls on Mainframe’s Role in Today’s IT Infrastructure (including the Cloud)

Of course, using a mainframe effectively today does require integrating your mainframe environment into the rest of your infrastructure — whether it involves the cloud or not. To learn more about how modern organizations are using mainframes in conjunction with cloud environments and other types of infrastructure, check out the State of the Mainframe survey report.

Syncsort + Trillium Software Blog

How to get the loss function used in NetTrain?

August 31, 2017   BI News and Info

When the third argument to NetTrain is set to Automatic, how can you tell what loss function it actually used and extract it?

Looking at the docs, I don’t see it as a property that can be asked for in the fourth argument.

Recent Questions – Mathematica Stack Exchange

Double-Digit ATM Compromise Growth Continues in US

August 31, 2017   FICO

While data breaches and ransomware grab the headlines, we’re still seeing fraud growth due to ATM compromises in the US. The fraud growth rate has slowed down from the gangbusters surge we saw in 2015, but consumers and issuers still need to pay attention.

The latest data from the FICO® Card Alert Service, which monitors hundreds of thousands of ATMs and other readers in the US, shows a 39 percent increase in the number of cards compromised at US ATMs and merchants in the first six months of 2017, compared to the same period in 2016. The number of POS device and ATM compromises rose 21 percent in the same period.

Beyond the numbers, at FICO we have seen the rate of fraud pattern changes accelerating over the last two years. As criminals try to beat the system, we are continually adapting the predictive analytics we use to detect compromises.

This year marks the 25th anniversary of FICO’s work in AI and machine learning, which began with Falcon Fraud Manager. Ever since then, we have been in an innovation race with the bad guys. The latest figures show they’re not slowing down. Neither are we.

Tips for Consumers

  • If an ATM looks odd, or your card doesn’t enter the machine smoothly, consider going somewhere else for your cash.
  • Never approach an ATM if anyone is lingering nearby. Never engage in conversations with others around an ATM. Remain in your automobile until other ATM users have left the ATM.
  • If your plastic card is captured inside of an ATM, call your card issuer immediately to report it. Sometimes you may think that your card was captured by the ATM when in reality it was later retrieved by a criminal who staged its capture. Either way, you will need to arrange for a replacement card as soon as possible.
  • Ask your card issuer for a new card number if you suspect that your payment card may have been compromised at a merchant, restaurant or ATM. It’s important to change both your card number and your PIN whenever you experience a potential theft of your personal information.
  • Check your card transactions frequently, using online banking and your monthly statement.
  • Ask your card provider if they offer account alert technology that will deliver SMS text communications or emails to you in the event that fraudulent activity is suspected on your payment card.
  • Update your address and cell phone information for every card you have, so that you can be reached if there is ever a critical situation that requires your immediate attention.

If you’re interested in seeing card fraud trends in another part of the world, check out our European Fraud Map. And follow my fraud commentary on Twitter @FraudBird.

FICO

Rise in Mobile Commerce Fuels Demand for Omnichannel Service

August 31, 2017   CRM News and Info

Omnichannel support and communications are becoming essential for brands, as consumers’ increasing use of mobile devices to make purchases and access content fuels their demand to be able to connect with companies or customer service when and how they want.

However, brands lack the necessary expertise and infrastructure, and are turning to mobile network operators, or MNOs, to acquire omnichannel capabilities, suggests data gathered by the CMO Council.

The organization considered the survey responses of marketing leaders, along with interviews with senior marketers at various major companies, plus comments from a CMO roundtable at the Mobile World Congress held in Barcelona earlier this year, to reach its conclusions.

Brand Marketer Perspectives

Among the CMO Council’s findings:

  • Eighty-one percent of brand marketers surveyed said their business relied on global customer connectivity, secure digital communication, real-time customer interaction and multichannel content delivery.
  • Forty-nine percent of brand marketers believed communications service providers could play a leadership role in omnichannel transformation by identifying engagement best practices and providing brands with an optimized framework for engagement.

Brand marketers wanted intelligence and insights into their customers from communications providers:

  • Seventy-three percent sought behavioral insights;
  • Forty-three percent sought location-related insights;
  • Forty-two percent sought social and messaging patterns;
  • Forty-two percent sought demographic segmentation;
  • Forty-two percent sought transactional data; and
  • Twenty-eight percent wanted information on Web or mobile activities.

“Brands aren’t asking MNOs to even necessarily lead by example, although there are several brands that look to customer experience commitments at T-Mobile as a best practice,” observed Liz Miller, SVP of marketing at the CMO Council.

“The opportunity here is a partnership that empowers the brand to know far more about their customer, thanks to the data MNOs have readily accessible,” she told CRM Buyer.

On the Carrier Side

Sixty percent of telco marketers thought there was potential for telcos to provide an optimized framework for omnichannel engagement.

However, telcos might not be able to provide the support needed as yet, because they also are struggling to keep pace with the need to transform:

  • Forty-seven percent of telco marketers said their organizations were evolving, but more effort was needed;
  • Fifty percent felt they were failing to meet subscribers’ needs for on-demand, personalized engagements;
  • Fifty percent said they were committed to an evolved experience but weren’t there yet; and
  • Thirty-five percent said they had a long way to go.

Keen competition from new players and Web-driven startups using disruptive innovation has made it more difficult for telcos to meet subscribers’ new demands.

Need for Talent, Tech and Processes

Telco marketers blamed lack of talent, technology and processes, as well as inadequate support from senior management and executives.

Paradoxically, telco marketers felt brands were well ahead of them.

“The answer for both talent and technology starts in the same place,” the CMO Council’s Miller noted.

“Know what you need before you buy what you want,” she advised.

“Far too often we grab a new piece of technology and pray it’s the silver bullet that will magically transport us from today’s status quo to CX nirvana. Then, when it doesn’t work, we bring on high-priced talent,” Miller said.

Siloed by Design

“Most brands can’t do omnichannel because they’ve built teams that are siloed by design,” maintained Ray Wang, principal analyst at Constellation Research.

“They have the mobile folks, then they have the Web folks, then they create a digital team — and it all gets very confusing,” he told CRM Buyer. “Eighty-seven percent of omnichannel efforts fail because they lack the right governance and organizational strategy for success.”

MNO Partnership Issues

Most MNOs “have come from traditional telecom roots — not really the pinnacle of innovation or customer service,” observed Rebecca Wettemann, research VP at Nucleus Research.

It will “take a different type of leadership for MNOs to be innovative partners,” she told CRM Buyer.

The CMO Council has teamed with Open Roads to develop an actionable framework with a common model, an implementation road map, and best practices and processes for more effective and consistent omnichannel engagement.

Adherence will be “100 percent voluntary,” Miller said. “It’s a guide, not a mandate.”

Richard Adhikari has been an ECT News Network reporter since 2008. His areas of focus include cybersecurity, mobile technologies, CRM, databases, software development, mainframe and mid-range computing, and application development. He has written and edited for numerous publications, including Information Week and Computerworld. He is the author of two books on client/server technology.
Email Richard.

CRM Buyer

Supporting Advanced Data Access Scenarios in Tabular 1400 Models

August 31, 2017   Self-Service BI

In the context of this blog article, advanced data access scenarios refers to situations where data access functions inside M expressions cannot easily be represented through data source definitions in Tabular metadata. These scenarios are predominantly encountered when copying and pasting complex M expressions from an Excel workbook or a Power BI Desktop file into a Tabular 1400 model, or when importing Power BI Desktop files into Azure Analysis Services. Often, these complex M expressions require manual tweaking for processing of the Tabular 1400 model to succeed.

Before jumping into these advanced data access scenarios, let’s review two basic concepts that differ between M expressions in Power BI Desktop files and in Tabular 1400 models: Data Source Representations (DSRs) and Credentials/Privacy Settings. Understanding the differences is a prerequisite to mastering the advanced data access scenarios covered later in this article.

In Power BI Desktop, you can write a very simplistic M expression like the following and import the resulting table of database objects into your data model:

let
    Source = Sql.Database("server01", "AdventureWorksDW")
in
    Source

Yet, in Tabular 1400, you first need to define a data source in the model, perhaps called “SQL/server01;AdventureWorksDW”, and then write the M expression like this:

let
    Source = #"SQL/server01;AdventureWorksDW"
in
    Source

The result is the same. So why did Tabular 1400 models introduce a different way of providing the data source information? One of the main reasons is explicit data source definitions facilitate deployments. Instead of editing 50 table expressions in a model as part of a production deployment, you can simply edit a single data source definition and all 50 table expressions access the correct production data source. Another key requirement is programmability. While SSDT Tabular is certainly a main tool for creating Tabular 1400 models, it’s by no means the only one. In fact, many Tabular models are created and deployed in an automated way through scripts or ETL packages. The ability to define data sources in a model programmatically is very important.

Going one level deeper, check out the definition of the Sql.Database function in the Power Query (M) Formula Reference. You can see it is defined as shown under Data Access Function below. Next, look at a structured data source definition for SQL Server in a Model.bim file of a Tabular 1400 model. The data source definition follows the format shown under Data Source Definition below. Comparing the two side by side shows you that all parameters of the Sql.Database function are also available in the data source definition. That’s not surprising. However, the Mashup engine only works with M expressions. It doesn’t know anything about Tabular 1400 metadata. During modeling and processing, the tools and the AS engine must translate the data access functions and their data source definitions back and forth with the help of the Mashup engine.

Data Access Function:

Sql.Database(
    server as text,
    database as text,
    optional options as nullable record) as table

Data Source Definition:

{
    "type": "structured",
    "name": "name as text",
    "connectionDetails": {
        "protocol": "tds",
        "address": {
            "server": "server as text",
            "database": "database as text"
        },
        "authentication": null,
        "query": "optional query as text"
    },
    "options": { optional options as nullable record },
    "credential": { ... }
}

As a side note, if your M expressions in Power BI Desktop use data access functions that do not yet have a data source representation in Tabular 1400, then you cannot use these M expressions in your Tabular 1400 model. We first must enable/extend the corresponding connector. Our goal is to add a data source representation to every connector available in Power BI Desktop and then enable these connectors for Tabular 1400.

So far, things are relatively straightforward. But, it gets tricky when credential objects come into the picture. In the M expressions above, where are the credentials to access the data source? Power BI Desktop stores the credentials in a user-specific location on the local computer. This mechanism doesn’t work for Tabular 1400 because these models are hosted on an Analysis Services server. The credentials are stored in the credential property of the structured data source in the model metadata (see the data source definition in the table above).

Given that Power BI Desktop stores the credentials outside of the .pbix file, how does the Mashup engine know which credentials to use for which data source? The crucial detail is the Mashup engine uses the data access parameters to establish the association between the data access functions and their credentials. In other words, if your Power BI Desktop file includes multiple Sql.Database(“server01”, “AdventureWorksDW”) statements, because they all use the same address parameters, there is only one credential for all of them. You can notice this in Power BI Desktop as you are prompted only once for credentials to access a given data source, no matter how often you use the corresponding data access function. By default, Power BI Desktop associates a credential with the highest level of the address parameters, such as the SQL Server server name. But you can select lower levels if the data source supports it, such as server and database name, as shown in the following screenshot.

[Screenshot: selecting the credential level for a SQL Server data source in Power BI Desktop]

The tricky part is the same association rules apply to Tabular 1400. Theoretically, each data source definition in a Tabular 1400 model has its own credential object. This is certainly the case at the Tabular metadata layer. But as mentioned earlier, the Mashup engine does not deal with Tabular metadata. The Mashup engine expects M expressions with resolved data access functions. The AS engine must translate the structured data source definitions accordingly. If two data source definitions had the same address parameters, it would no longer be possible to identify their corresponding credential objects uniquely at the M layer. To avoid credential ambiguity, there can be only one data source definition with a given set of address parameters in a Tabular 1400 model. Plans exist to eliminate this restriction in a future compatibility level, but in Tabular 1400 this limitation is important to keep in mind when dealing with advanced data access scenarios.

Now, let’s jump into these advanced data access scenarios with increasing levels of complexity.

As a warmup, look at the following M expression. It’s not very useful, but it illustrates the point. The expression includes two Sql.Database calls with the same address parameters, but with different options. As explained above, in Tabular 1400, you cannot create two separate data source objects for these two function calls. But if you can only create one, then you only get one options record (see the previous table). So what are you going to do with the second command timeout? Can you perhaps use the same command timeout and consolidate both into a single data source definition? What if not?

let
    Source1 = Sql.Database("server01", "AdventureWorksDW", [CommandTimeout=#duration(0, 0, 5, 0)]),
    Source2 = Sql.Database("server01", "AdventureWorksDW", [CommandTimeout=#duration(0, 0, 10, 0)]),
    Combined = Table.Combine({Source1, Source2})
in
    Combined

The solution is not intuitive. Perhaps even misleading. It is fragile, and it breaks dependency analysis in the tools. It also breaks the programmability contract because you can no longer simply update the data source definition and expect your M expressions to pick up the changes. It clearly needs a better implementation in a future compatibility level. But for now, you can simply define a data source with the address parameters shown above and then use the M expression unmodified in Tabular 1400. The credential object is magically applied to all corresponding data access functions that use the same parameters across all M expressions in the model. That’s how the Mashup engine works, as the following screenshot illustrates.

[Screenshot: M expression with data access calls using different options]

Note: This workaround of using a data source merely as a holder of a credential object is not recommended and should be considered a last-resort option. If possible, create a common set of options for all data access calls to the same source and use a single data source object to replace the data access functions. Another option might be to register the same server with multiple names in DNS and then use a different server name in each data source definition.
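
If you can consolidate the options this way, the contrived expression above reduces to references to a single data source object. A minimal sketch, assuming a single data source named “SQL/server01;AdventureWorksDW” that carries the shared options record (for example, one common command timeout):

let
    // #"SQL/server01;AdventureWorksDW" is assumed to be the one structured data source defined in the model
    Source = #"SQL/server01;AdventureWorksDW",
    // mirrors the Table.Combine in the contrived example above
    Combined = Table.Combine({Source, Source})
in
    Combined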

Perhaps, you can consolidate your data access options and keep things straightforward in your Tabular 1400 model. But what if your data access functions use different native queries as in the following example?

let
    Source1 = Sql.Database("server01", "AdventureWorksDW", [Query="SELECT * FROM dimCustomer WHERE LastName = 'Walker'"]),
    Source2 = Sql.Database("server01", "AdventureWorksDW", [Query="SELECT * FROM dimCustomer WHERE LastName = 'Jenkins'"]),
    Combined = Table.Combine({Source1, Source2})
in
    Combined

Of course, one option is to rewrite the first native query to deliver the combined result so that you can eliminate the second function call. Yet, native query rewrites are not really required if you modified the M expression and used the Value.NativeQuery function, as in the following expression. Now there is only a single Sql.Database function call, which can nicely be replaced with a data source definition in Tabular 1400.

let
    Source = Sql.Database("server01", "AdventureWorksDW"),
    Table1 = Value.NativeQuery(Source, "SELECT * FROM dimCustomer WHERE LastName = 'Walker'"),
    Table2 = Value.NativeQuery(Source, "SELECT * FROM dimCustomer WHERE LastName = 'Jenkins'"),
    Combined = Table.Combine({Table1, Table2})
in
    Combined

Note: Even if you consolidated the queries, as in SELECT * FROM dimCustomer WHERE LastName = ‘Walker’ Or LastName = ‘Jenkins’, avoid putting this native query on the query parameter of the data source definition. The query parameter exists for full compatibility between data access functions and data source definitions. But it shouldn’t be used because it narrows the data source down to a particular query. You could not import any other data from that source. As explained earlier, you cannot define a second data source object with the same address parameters in Tabular 1400. So, define the data source in broadest terms and then use the Value.NativeQuery function to submit a native query.

The next level of challenges revolves around the dynamic definition of data source parameters. This technique is occasionally used in Power BI Desktop files to maintain the address parameters for multiple data access function calls centrally. The following screenshot shows an example.

[Screenshot: a parameterized data access function call in Power BI Desktop]
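
Such a parameterized expression might look roughly like the following, where ServerName and DatabaseName are hypothetical Power BI Desktop parameter queries holding the address values:

let
    // ServerName and DatabaseName are hypothetical parameter queries,
    // e.g. defined elsewhere as "server01" and "AdventureWorksDW"
    Source = Sql.Database(ServerName, DatabaseName)
in
    Source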

Fortunately, the concept of M-based address parameter definitions is comparable to the concept of a data source definition in Tabular 1400. In most cases, you should be able to define the Tabular 1400 data source as always by simply using the parameter values directly. Then, replace the data access function calls in your M expressions with the reference to the data source, delete the M-based parameter definitions, and the job is done. The result is a common Tabular 1400 M expression.

let
    Source = #"SQL/server01;AdventureWorksDW"
in
    Source

Of course, this technique only works for trivial, single-valued parameter definitions. For more complex scenarios, well, read on.

The ultimate challenge in the advanced data access scenarios is the fully dynamic implementation of a data access function call. There are various flavors. Let’s analyze the following simple example first.

let
    #"Dynamic Function" = Sql.Database,
    Source = #"Dynamic Function"("server01", "AdventureWorksDW")
in
    Source

Clearly, this M expression is accessing a SQL Server database called AdventureWorksDW on server01. You can create a data source definition for this database in Tabular 1400 and replace these lines with a reference to that data source definition, as shown in the sample in the previous section above. This is easy because it wasn’t really a dynamic data source, yet. Here’s a fully dynamic example.

let
    Source = Sql.Database("server01", "Dynamic"),
    DataAccessPoints = Source{[Schema="dbo",Item="DataAccessPoints"]}[Data],
    NavigationTable = Table.AddColumn(DataAccessPoints, "Data", each Connect([FunctionName], [Parameter1], [Parameter2]))
in
    NavigationTable

This M expression retrieves all rows from a DataAccessPoints table in a SQL Server database called Dynamic. It then passes the column values of each row to a custom function called Connect, which then connects to the specified data source. The Connect function is based on the following M expression.

let
    Source = (FunctionName, Param1, Param2) => let
        FunctionTable = Record.ToTable(#shared),
        FunctionRow = Table.SelectRows(FunctionTable, each ([Name] = FunctionName)),
        Function = FunctionRow{0}[Value],
        Result = if Param2 = null then Function(Param1) else Function(Param1, Param2)
    in
        Result
in
    Source

Not only are the address parameters (Param1 and Param2) dynamically assigned, but the name of the data access function itself is also passed in as a parameter (FunctionName). So, unless you know the contents of the DataAccessPoints table, you cannot even determine what data sources this M expression accesses! You must evaluate the expressions against the actual data to know what credentials you need to supply, and you must define privacy settings. In this example, the following screenshot reveals that the expressions connect to an OData feed as well as a SQL Server database, but it could really be any supported data source type.

[Screenshot: fully dynamic data access expressions in Power BI Desktop]

So, if you wanted to use such a dynamic construct, you would have to create the corresponding data source objects with their credentials in your Tabular 1400 model. Power BI Desktop can prompt you for the missing credentials if you added a new entry to the DataAccessPoints table, but Analysis Services cannot because there is no interactive user on a server, and processing would fail. You must add a new matching data source definition with the missing credentials and privacy settings upfront, which somewhat defeats the purpose of a fully dynamic data source definition.

Perhaps you are wondering at this point why we didn’t provide more flexible support for credential handling in Tabular 1400. One of the main reasons is that the footprint of the modern Get Data experience on the Mashup engine is already significant. It wasn’t justifiable to take a sledgehammer approach at the Mashup layer, especially when it concerns security features used across several Microsoft technologies, including Power BI.

Fully dynamic scenarios are certainly interesting and fun, but they are corner cases. The more common advanced data access scenarios can be handled with moderate effort in Tabular 1400. And a future compatibility level is going to remove the limitation of a single data source definition per set of address parameters. It requires some replumbing deep in the Mashup engine. On the other hand, how Tabular models are going to support fully dynamic data source definitions in future compatibility levels hasn’t yet been decided. One idea is to move credential storage out of the Tabular metadata. Another is to introduce new metadata objects that can be associated with data access functions through their address parameters. Perhaps another is to invent a form of processing handshake. And there may be other options. If you have a great idea, please post a comment or send it our way via ssasprev at microsoft.com, or use any other available communication channels such as UserVoice or MSDN forums.

That’s it for this excursion into handling advanced data access scenarios when moving M expressions from Power BI Desktop to a Tabular 1400 model. One of the next articles will show you how to use legacy provider data sources and native query partitions together with the new structured data sources and M partitions in a Tabular 1400 model. Stay tuned for more Get Data coverage on the Analysis Services Team Blog!

Analysis Services Team Blog

Pizza Time

August 31, 2017   Humor

Can you taste it?

“Pizza.”

Image courtesy of http://imgur.com/gallery/vnM7tz8.

Quipster

How to Create Microsoft Word Templates in Microsoft Dynamics 365

August 31, 2017   Microsoft Dynamics CRM

In a previous blog, you learned How to Easily Generate Excel Templates, but what if you wanted to use Microsoft Word? Using Microsoft Word templates in Microsoft Dynamics 365 can be a huge time-saver when you want to provide a professional document for an entity, especially if you need to produce the document more than once for more than one record. Let’s get to it!

What are Microsoft Word templates?

Microsoft Word templates are documents that are created once but can be used on multiple records directly from Microsoft Dynamics 365. Templates can be used to generate anything from thank-you notes to birthday cards, or even a list of opportunities associated with an account by using XML – but before you run to get a developer, don’t worry: no coding is required!

When would you use a Microsoft Word template?

  • If you wanted to send a note to a key contact on their birthday, you could create a “Happy Birthday” template; once you print it off, you could give it your own signature to add a personal touch.
  • Another case would be if you wanted to remind an account of all their opportunities and the associated proposed budgets.
  • A third example is if you wanted an account to verify the contacts on their team and what levels of authority each has, such as purchasing authority, service-ticket creation authority, or IT architecture authority.
  • Or, any time you want information from an Account, Contact, Opportunity or any entity to auto-populate into a Microsoft Word document for mailing, emailing or reporting.

How to Create a Microsoft Word Template?

  1. Navigate to Account View
  2. Select Excel Templates from the Command Bar and Create Excel Template

  3. Select Word Template
  4. Select the Desired Entity

  5. Select any related entities that you will use in the template, such as Opportunities, Quotes, Orders or Invoices. Once you click “Download Template” the download will start automatically, and the resulting file will look like a blank MS Word document.

  6. In Microsoft Word, make sure the Developer tab is shown in the Ribbon. If it is not, right-click the Ribbon and select “Customize the Ribbon,” then check the box next to Developer in the right pane and select OK. Once the Developer tab is on the Ribbon, click “Display the XML Mapping Pane” under the Developer tab.
  7. Select CRM in the Custom XML Part drop-down menu
  8. As you type your document, right-click fields to insert them into the document.

  9. You can insert tables as well: create the table in Microsoft Word, select the row, then right-click the opportunity entity at the bottom of the XML Mapping Pane, expand the Insert Content Control section and select Repeating.

  10. Then populate the table with content controls just like before.

  11. The next step is to upload the Word template so that it can be used. Save your Word document and navigate back to Dynamics 365 and repeat steps 1-4, but instead of clicking Select Entity, click the Upload button. Then browse to your file and upload the document.
  12. Once the upload is complete, the template is available for use on the account page. Test your work by navigating to an account, selecting the ellipsis > Word Templates > Personal Word Templates, and selecting your desired template.

  13. If you want to make any changes to the document once you download it, you can, just as you would in a regular document.

So, whether you’re sending out birthday cards, providing updates, or confirming information: if you need to make the same form more than once, create a template and eliminate the busy work of recreating documents.

Be sure to subscribe to our blog so you have immediate updates to all of the new tips and tricks for Microsoft Dynamics 365, and if you would like more in-depth information on any CRM topic check out our CRM Book!

Happy Dynamics 365’ing!

PowerObjects- Bringing Focus to Dynamics CRM

How To Keep Up With China’s Rapidly Changing Automotive Industry

August 31, 2017   SAP

Also, while WhatsApp has a large following, China’s 768 million WeChat users have access to far more advanced features, such as electronic payments and online gaming. It seems as if WeChat payments are ubiquitous in China and will soon replace both cash and credit card payments.

The bottom line

In a globally connected world, automotive companies are reinventing themselves. China is a great example of jumping technology stages in an effort to become the leader in electric mobility.

Technology companies recognize the significance of China becoming a leader in the automotive world. Recognizing also means listening to the specifics of the market, the tech infrastructure ecosystem, and consumer preferences, and providing breakthrough cloud-based technologies that enable fast-moving players to stay ahead of the game.

This story also appeared on the SAP Community. Follow me @ulimuench.

Digitalist Magazine

CRM Package Deployer For The Win!

August 30, 2017   CRM News and Info

A few years back I wrote about beginning to work with the CRM Package Deployer out of the CRM SDK tools folder. Check out the article here. Back then, we liked the flexibility of being able to deploy multiple solutions with sample data without needing to write our own installer. We have continued to use it in all of our customer implementations since, and it has worked well for us.

With the change to Microsoft Dynamics CRM and the recent rebranding to Dynamics 365, Microsoft has taken the approach of allowing apps by ISVs, which are tested and approved by Microsoft Dev, to be released to their app store. We have successfully navigated this new method and now have two apps published in the AppSource! Find them here. During the process, we were happy to see Microsoft standing behind their tools, and they requested that apps be delivered in a CRM deployment package. Plus, we had already done most of the work for that. Win!

There were two items that we came across when getting our apps certified that may save others time when looking to certify their app.

  • We found that it was necessary to use the Dynamics 365 SDK version of the Package Deployer.  The Dynamics CRM 2016 SDK would not connect to the online instances.
  • When placing your Solution files into the package, the files need to exist directly under the package folder instead of being placed in the content folder as recommended in the documentation. The package will build and run independently during your tests, but when sending to Microsoft for certification, their automated package tester will fail. We have mentioned this to the certification team. They are aware that this is happening and are looking to correct the issue.  Until that is fixed, this is the only way to get past the certification process.

It is really great to see Microsoft leveraging their own tools, and we hope that they will continue to work on and improve the tool going forward. There is certainly plenty of room for improvement, but with the ability to add your own custom code, we can find our way around most of the current limitations. As new changes come about, I’ll try to pass those changes along in future articles. In the meantime, if you have any questions, let us know!

Written by Bryan Page, Developer at Rockton Software, a Microsoft Dynamics 365 application provider. Check out our latest application, Recurring Billing, now live on the Microsoft AppSource. 

CRM Software Blog | Dynamics 365

Pence caught trying to act presidential – deletes tweet as #TrumpRussia moves forward

August 30, 2017   Humor

Amid the Trumpian bluster of the Arpaio pardon and the ratings Trump wanted because of Hurricane Harvey, Mike Pence was doing his level best to demonstrate his presidential ability.

He suddenly realized that he was doing what others had discovered too late, that one never outshines god-emperor Trump.

In stark contrast to most of the pictures of President Trump, who is usually golfing, watching television, or signing things he hasn’t read, Pence is reading and communicating with lawmakers the way one usually expects the President to.

So why would Pence, or his staff, delete this Tweet? The only reasonable explanation is that they wish to soothe the President’s paranoid mind, which is constantly watching for challenges to his tenuous grasp on power.

Trump’s hypersensitive ego desperately seeks affirmation that he is still the Most Presidential Man In Washington and would not take kindly to any perceived threat from his running mate.

Meanwhile…

Special Counsel Robert Mueller is closing in on Trump from so many angles it’s hard to keep track.

It’s only a matter of time before Trump is removed from office.

The only question is whether or not Mike Pence will go down with him and from the looks of it, he will.

Michael Flynn was the beginning of the end for the current administration. The New York Times is reporting that Robert Mueller is now asking for the White House to turn over documents pertaining to Flynn’s employment as a foreign agent and with the Trump campaign. Mueller is supposedly investigating whether Flynn was getting kickbacks from his time on the Trump campaign, which would be classified as fraud.

Flynn has already admitted to the Trump team that he was a paid foreign agent, but he was still hired for the National Security Adviser position. Mike Pence knew Michael Flynn was a criminal, but claimed his innocence anyway.

Mueller’s investigation into these kickbacks brings up the question, did Pence know about those too? If Pence knew about these crimes then he committed obstruction of justice.

moranbetterDemocrats
