Monthly Archives: August 2017

Why Mainframes Matter in the Age of Cloud Computing

Has cloud computing killed mainframes? You might think so. In fact, mainframes remain supremely important, even in the age of the cloud. Here’s why.

It’s easy enough to understand why one might think that mainframes are no longer relevant in the present age of “cloud-native” everything. Today, everyone and their mother is moving workloads to the cloud.

Public cloud computing platforms like AWS and Azure are generating record amounts of revenue, while private cloud frameworks, such as OpenStack, are more popular than ever.

You may think that, as a technology introduced more than a half-century ago, mainframes don’t have much to offer in a world dominated by much newer cloud computing technology.

Mainframes are the Original Cloud

But they do. The thing is, many of the qualities that make cloud computing so valuable to enterprises are also true of mainframes.

Think about it. Cloud computing has become popular because the cloud gives you:

  • On-demand access to compute and storage resources that can scale virtually without limit.
  • A more cost-efficient way to maintain infrastructure than relying on local commodity servers.
  • The flexibility to spin up different types of software environments. In the cloud, you can run any type of host operating system you want. You can use virtual servers, or you can use containers. You get a lot of flexibility.
  • A lower maintenance burden. When you move workloads to the cloud, you no longer have to maintain lots of individual on-premise servers.

To a large extent, mainframes offer precisely the same benefits. Consider the following points:

  • A mainframe provides a single infrastructure that delivers massive compute and storage resources whenever your applications require them.
  • If you use a mainframe, it’s probably one that you already own. You don’t have to purchase new infrastructure. In addition, a mainframe can last for decades, whereas a commodity server usually lasts just a few years. For both reasons, mainframes offer cost-efficiency.
  • Mainframes give you plenty of software flexibility. You can use a mainframe operating system like z/OS, or you can use Linux. You can even run Docker containers on your mainframe.
  • When you run applications on a mainframe, you only have one machine to maintain. This is simpler than maintaining dozens of on-premise commodity servers.

Mainframes have long offered these benefits, but the cloud computing concept as we know it today has been around only for about a decade.

Essentially, then, mainframes were the original cloud. Going back decades, mainframes allowed companies to build cloud infrastructure avant la lettre (which is a fancy way of saying “before the term cloud infrastructure existed”).

Mainframes and Cloud Computing Today

Don’t get me wrong. I’m not anti-cloud. You should move workloads to the cloud when the cloud is the best fit for a particular type of workload.

But if you have a mainframe environment set up, there’s a good chance that your mainframe can provide many of the same benefits as a cloud platform. In that case, you’ll save time, effort and money by continuing to use your mainframe to provide the same cloud-like benefits that it has been delivering for years.

In short, the cloud is not a replacement for your mainframe. It’s just a newer, flashier way of implementing the same types of functionality that mainframes offered starting decades ago.

Related: Expert Interview: Trevor Eddolls on Mainframe’s Role in Today’s IT Infrastructure (including the Cloud)

Of course, using a mainframe effectively today does require integrating your mainframe environment into the rest of your infrastructure — whether it involves the cloud or not. To learn more about how modern organizations are using mainframes in conjunction with cloud environments and other types of infrastructure, check out the State of the Mainframe survey report.

Syncsort + Trillium Software Blog

How to get the loss function used in NetTrain?

When the third argument to NetTrain is set to Automatic, how can you tell what loss function it actually used and extract it?

Looking at the docs, I don’t see it as a property that can be asked for in the fourth argument.

Recent Questions – Mathematica Stack Exchange

Double-Digit ATM Compromise Growth Continues in US

While data breaches and ransomware grab the headlines, we’re still seeing fraud growth due to ATM compromises in the US. The fraud growth rate has slowed down from the gangbusters surge we saw in 2015, but consumers and issuers still need to pay attention.

The latest data from the FICO® Card Alert Service, which monitors hundreds of thousands of ATMs and other readers in the US, shows a 39 percent increase in the number of cards compromised at US ATMs and merchants in the first six months of 2017, compared to the same period in 2016. The number of POS device and ATM compromises rose 21 percent in the same period.

Beyond the numbers, at FICO we have seen the rate of fraud pattern changes accelerating over the last two years. As criminals try to beat the system, we are continually adapting the predictive analytics we use to detect compromises.

This year marks the 25th anniversary of FICO’s work in AI and machine learning, which began with Falcon Fraud Manager. Ever since then, we have been in an innovation race with the bad guys. The latest figures show they’re not slowing down. Neither are we.

Tips for Consumers

  • If an ATM looks odd, or your card doesn’t enter the machine smoothly, consider going somewhere else for your cash.
  • Never approach an ATM if anyone is lingering nearby. Never engage in conversations with others around an ATM. Remain in your automobile until other ATM users have left the ATM.
  • If your plastic card is captured inside of an ATM, call your card issuer immediately to report it. Sometimes you may think that your card was captured by the ATM when in reality it was later retrieved by a criminal who staged its capture. Either way, you will need to arrange for a replacement card as soon as possible.
  • Ask your card issuer for a new card number if you suspect that your payment card may have been compromised at a merchant, restaurant or ATM. It’s important to change both your card number and your PIN whenever you experience a potential theft of your personal information.
  • Check your card transactions frequently, using online banking and your monthly statement.
  • Ask your card provider if they offer account alert technology that will deliver SMS text communications or emails to you in the event that fraudulent activity is suspected on your payment card.
  • Update your address and cell phone information for every card you have, so that you can be reached if there is ever a critical situation that requires your immediate attention.

If you’re interested in seeing card fraud trends in another part of the world, check out our European Fraud Map. And follow my fraud commentary on Twitter @FraudBird.

FICO

Built to Run a Business: NetSuite’s Misunderstood Differentiator

Posted by Paul Farrell, VP of Product Marketing, Oracle NetSuite

A key factor in NetSuite’s success from Day One was the overarching goal of the company. NetSuite was built to run a business. The suite includes many outstanding capabilities built in from the ground up, including pure cloud architecture, built-in Business Intelligence, a customization framework, ecommerce and more. These are all incredible capabilities, but its core mission — software built to run a business — has always been a subtle, misunderstood differentiator.

It’s amazing how many other competitive products out there started off with a different focus. That focus was not running a business. They were created to run a sales force, a shop floor, a warehouse, an HR team or the accounts. Then, out of this primary goal, other functions were shoehorned in many years later, built on foundations that were not fit for purpose. For example, there are several solutions out there that are based on simple sales force automation foundations that morphed into CRM, and only then was the rest of the business functionality added. Their mantra is that the customer is the center of everything. However, the way you sell to and interact with your customers may be very different from the way you need to financially structure your business because of geographic or tax constraints. And if the product was designed purely for financials, the account is at the center of everything, forcing you to sell to customers based on your financial structure. The same can be said for the shop floor, where the work center is the center of everything, or the HR system, where the employee is the center, and so on.

Users want a system that caters to their role. A warehouse manager wants the system to be based around the product and the warehouse, as that is their entire world. Not that they want to forget the customer or the accounting impact, but it is not their focus. A product that is centered around the general ledger or the customer is going to get in the way of doing their job.

With NetSuite, the core of the product puts all the key business functions on an equal footing: Account, Customer, Product, Service, Order, Person. They are all seamlessly linked together in a common business model. Then surrounding that are the industries. There are two distinct types of industries: service industries (Software, Service, Financial Services) and product industries (Manufacturing, Distribution, Retail). Again, similar to business functions, many solutions started off supporting one type of industry and then morphed into supporting another. As a result, you get a manufacturing product trying to support the services industry and vice versa. It might functionally work, but are users going to enjoy creating work orders, work centers and parts, capturing WIP, and in general using manufacturing terminology and concepts to run a software installation project? The same can be said for a distribution company using functionality that was built to support the financial services industry. It might eventually work, but it will feel like pushing a boulder up a never-ending muddy hill. Companies will spend forever customizing it to try and make it fit. All they end up with is an unsupported system that no one understands, no one likes and that barely does the job. NetSuite was built to run product- and service-based businesses from the beginning. Whatever the business, it provides the terminology, workflows and reports needed for that type of company.

Moreover, for hybrid businesses it will simultaneously look as though it were designed to run both depending on what part of the business you sit within.

Businesses are made up of many people carrying out multiple distinct roles. One of the real issues that ERP has had over the years is that many times it is purchased by one group and then forced on the others with the promise of producing consistent information and consistent results. For example, the product that was originally built for CRM is great for the sales force, clunky for operations, and a nightmare for finance. Or the product built for finance is great for accounting, clunky for operations and a nightmare for sales. The result is generally long implementations, budget overruns, massive complex customizations, bolt-on systems delivering subpar results, and customers locked into a version of the product for years to come. NetSuite was built to run a business: from Day One it was optimized to manage the multiple distinct roles every business has. So, when the controller of a software company logs in, it looks like it was written just for them. When the warehouse manager logs in, it looks like it was written just for them; when the salesperson logs in, it looks like it was built just for them. Even when the customer logs in, it looks like it was built just for them. NetSuite users therefore learn the product more quickly and enjoy using it, leading to faster implementations, quicker ROI, better quality of data and more productive, effective businesses.

SuiteSuccess, recently announced by NetSuite, is a manifestation of all of this. SuiteSuccess was engineered to solve unique industry challenges that historically have limited a company’s ability to grow, scale and adapt to change. Most ERP vendors have tried to solve the industry solution problem with templates, rapid implementation methodologies, and custom code (trying to force that round peg into the square hole). NetSuite took a holistic approach to the problem, ensuring every single aspect of the solution and delivery is oriented around the customer’s industry and the roles within that industry. There are four pillars to the solution.

1) Build: develop a complete unified suite covering all elements required to run a business.

2) Engage: Leading practices for each industry, including all the capabilities, workflows, BI, reports and roles that industry requires, allowing customers to understand, from the first sales engagement, how the product will support their business.

3) Consume: A re-imagined consumption model using NetSuite’s industry stairways allows companies to consume capabilities based on their business needs.

4) Optimize: Continuous improvement of all aspects and consumption of the solution.

NetSuite was built to run a business and handle the complexity and variances real businesses encounter. That is why NetSuite is the leading cloud ERP, delivering rapid ROI to its customers with unrivalled referenceability within the industries it serves.

If you have a business to run, you should always look to NetSuite.

Posted on Wed, August 30, 2017 by NetSuite

The NetSuite Blog

Rise in Mobile Commerce Fuels Demand for Omnichannel Service

Omnichannel support and communications are becoming essential for brands, as consumers’ increasing use of mobile devices to make purchases and access content fuels their demand to be able to connect with companies or customer service when and how they want.

However, brands lack the necessary expertise and infrastructure, and are turning to mobile network operators, or MNOs, to acquire omnichannel capabilities, suggests data gathered by the CMO Council.

The organization considered the survey responses of marketing leaders, along with interviews with senior marketers at various major companies, plus comments from a CMO roundtable at the Mobile World Congress held in Barcelona earlier this year, to reach its conclusions.

Brand Marketer Perspectives

Among the CMO Council’s findings:

  • Eighty-one percent of brand marketers surveyed said their business relied on global customer connectivity, secure digital communication, real-time customer interaction and multichannel content delivery.
  • Forty-nine percent of brand marketers believed communications service providers could play a leadership role in omnichannel transformation by identifying engagement best practices and providing brands with an optimized framework for engagement.

Brand marketers wanted intelligence and insights into their customers from communications providers:

  • Seventy-three percent sought behavioral insights;
  • Forty-three percent sought location-related insights;
  • Forty-two percent sought social and messaging patterns;
  • Forty-two percent sought demographic segmentation;
  • Forty-two percent sought transactional data; and
  • Twenty-eight percent wanted information on Web or mobile activities.

“Brands aren’t asking MNOs to even necessarily lead by example, although there are several brands that look to customer experience commitments at T-Mobile as a best practice,” observed Liz Miller, SVP of marketing at the CMO Council.

“The opportunity here is a partnership that empowers the brand to know far more about their customer, thanks to the data MNOs have readily accessible,” she told CRM Buyer.

On the Carrier Side

Sixty percent of telco marketers thought there was potential for telcos to provide an optimized framework for omnichannel engagement.

However, telcos might not be able to provide the support needed as yet, because they also are struggling to keep pace with the need to transform:

  • Forty-seven percent of telco marketers said their organizations were evolving, but more effort was needed;
  • Fifty percent felt they were failing to meet subscribers’ needs for on-demand, personalized engagements;
  • Fifty percent said they were committed to an evolved experience but weren’t there yet; and
  • Thirty-five percent said they had a long way to go.

Keen competition from new players and Web-driven startups using disruptive innovation has made it more difficult for telcos to meet subscribers’ new demands.

Need for Talent, Tech and Processes

Telco marketers blamed lack of talent, technology and processes, as well as inadequate support from senior management and executives.

Paradoxically, telco marketers felt brands were well ahead of them.

“The answer for both talent and technology starts in the same place,” the CMO Council’s Miller noted.

“Know what you need before you buy what you want,” she advised.

“Far too often we grab a new piece of technology and pray it’s the silver bullet that will magically transport us from today’s status quo to CX nirvana. Then, when it doesn’t work, we bring on high-priced talent,” Miller said.

Siloed by Design

“Most brands can’t do omnichannel because they’ve built teams that are siloed by design,” maintained Ray Wang, principal analyst at Constellation Research.

“They have the mobile folks, then they have the Web folks, then they create a digital team — and it all gets very confusing,” he told CRM Buyer. “Eighty-seven percent of omnichannel efforts fail because they lack the right governance and organizational strategy for success.”

MNO Partnership Issues

Most MNOs “have come from traditional telecom roots — not really the pinnacle of innovation or customer service,” observed Rebecca Wettemann, research VP at Nucleus Research.

It will “take a different type of leadership for MNOs to be innovative partners,” she told CRM Buyer.

The CMO Council has teamed with Open Roads to develop an actionable framework with a common model, an implementation road map, and best practices and processes for more effective and consistent omnichannel engagement.

Adherence will be “100 percent voluntary,” Miller said. “It’s a guide, not a mandate.”

Richard Adhikari has been an ECT News Network reporter since 2008. His areas of focus include cybersecurity, mobile technologies, CRM, databases, software development, mainframe and mid-range computing, and application development. He has written and edited for numerous publications, including Information Week and Computerworld. He is the author of two books on client/server technology.
Email Richard.

CRM Buyer

Supporting Advanced Data Access Scenarios in Tabular 1400 Models

In the context of this blog article, advanced data access scenarios refers to situations where data access functions inside M expressions cannot easily be represented through data source definitions in Tabular metadata. These scenarios are predominantly encountered when copying and pasting complex M expressions from an Excel workbook or a Power BI Desktop file into a Tabular 1400 model, or when importing Power BI Desktop files into Azure Analysis Services. Often, these complex M expressions require manual tweaking for processing of the Tabular 1400 model to succeed.

Before jumping into these advanced data access scenarios, let’s review two basic concepts that differ between M expressions in Power BI Desktop files and in Tabular 1400 models: Data Source Representations (DSRs) and Credentials/Privacy Settings. Understanding the differences is a prerequisite to mastering the advanced data access scenarios covered later in this article.

In Power BI Desktop, you can write a very simplistic M expression like the following and import the resulting table of database objects into your data model:

let
    Source = Sql.Database("server01", "AdventureWorksDW")
in
    Source

Yet, in Tabular 1400, you first need to define a data source in the model, perhaps called “SQL/server01;AdventureWorksDW”, and then write the M expression like this:

let
    Source = #"SQL/server01;AdventureWorksDW"
in
    Source

The result is the same. So why did Tabular 1400 models introduce a different way of providing the data source information? One of the main reasons is explicit data source definitions facilitate deployments. Instead of editing 50 table expressions in a model as part of a production deployment, you can simply edit a single data source definition and all 50 table expressions access the correct production data source. Another key requirement is programmability. While SSDT Tabular is certainly a main tool for creating Tabular 1400 models, it’s by no means the only one. In fact, many Tabular models are created and deployed in an automated way through scripts or ETL packages. The ability to define data sources in a model programmatically is very important.
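
For instance, once a data source is defined in the model, each table partition expression simply references it by name and navigates to the table it needs. The following sketch reuses the sample server and database names from above, with dimCustomer standing in for whatever table the partition actually imports:

let
    Source = #"SQL/server01;AdventureWorksDW",
    dbo_dimCustomer = Source{[Schema="dbo",Item="dimCustomer"]}[Data]
in
    dbo_dimCustomer

Because partitions only reference the data source by name, repointing the model from a development server to a production server is a matter of editing that one data source definition.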

Going one level deeper, check out the definition of the Sql.Database function in the Power Query (M) Formula Reference. You can see it is defined as shown in the Data Access Function column of the following table. Next, look at a structured data source definition for SQL Server in a Model.bim file of a Tabular 1400 model. The data source definition follows the format shown in the Data Source Definition column of the table below. A side-by-side comparison shows you that all parameters of the Sql.Database function are also available in the data source definition. That’s not surprising. However, the Mashup engine only works with M expressions. It doesn’t know anything about Tabular 1400 metadata. During modeling and processing, the tools and the AS engine must translate back and forth between the data access functions and their data source definitions with the help of the Mashup engine.

Data Access Function:

    Sql.Database(
        server as text,
        database as text,
        optional options as nullable record) as table

Data Source Definition:

    {
      "type": "structured",
      "name": "name as text",
      "connectionDetails": {
        "protocol": "tds",
        "address": {
          "server": "server as text",
          "database": "database as text"
        },
        "authentication": null,
        "query": "optional query as text"
      },
      "options": { optional options as nullable record },
      "credential": { … }
    }

As a side note, if your M expressions in Power BI Desktop use data access functions that do not yet have a data source representation in Tabular 1400, then you cannot use these M expressions in your Tabular 1400 model. We first must enable/extend the corresponding connector. Our goal is to add a data source representation to every connector available in Power BI Desktop and then enable these connectors for Tabular 1400.

So far, things are relatively straightforward. But, it gets tricky when credential objects come into the picture. In the M expressions above, where are the credentials to access the data source? Power BI Desktop stores the credentials in a user-specific location on the local computer. This mechanism doesn’t work for Tabular 1400 because these models are hosted on an Analysis Services server. The credentials are stored in the credential property of the structured data source in the model metadata (see the data source definition in the table above).

Given that Power BI Desktop stores the credentials outside of the .pbix file, how does the Mashup engine know which credentials to use for which data source? The crucial detail is the Mashup engine uses the data access parameters to establish the association between the data access functions and their credentials. In other words, if your Power BI Desktop file includes multiple Sql.Database(“server01”, “AdventureWorksDW”) statements, because they all use the same address parameters, there is only one credential for all of them. You can notice this in Power BI Desktop as you are prompted only once for credentials to access a given data source, no matter how often you use the corresponding data access function. By default, Power BI Desktop associates a credential with the highest level of the address parameters, such as the SQL Server server name. But you can select lower levels if the data source supports it, such as server and database name, as shown in the following screenshot.

[Screenshot: selecting the credential level for a SQL Server data source in Power BI Desktop]

The tricky part is the same association rules apply to Tabular 1400. Theoretically, each data source definition in a Tabular 1400 model has its own credential object. This is certainly the case at the Tabular metadata layer. But as mentioned earlier, the Mashup engine does not deal with Tabular metadata. The Mashup engine expects M expressions with resolved data access functions. The AS engine must translate the structured data source definitions accordingly. If two data source definitions had the same address parameters, it would no longer be possible to identify their corresponding credential objects uniquely at the M layer. To avoid credential ambiguity, there can be only one data source definition with a given set of address parameters in a Tabular 1400 model. Plans exist to eliminate this restriction in a future compatibility level, but in Tabular 1400 this limitation is important to keep in mind when dealing with advanced data access scenarios.

Now, let’s jump into these advanced data access scenarios with increasing levels of complexity.

As a warmup, look at the following M expression. It’s not very useful, but it illustrates the point. The expression includes two Sql.Database calls with the same address parameters, but with different options. As explained above, in Tabular 1400, you cannot create two separate data source objects for these two function calls. But if you can only create one, then you only get one options record (see the previous table). So what are you going to do with the second command timeout? Can you perhaps use the same command timeout and consolidate both into a single data source definition? What if not?

let
    Source1 = Sql.Database("server01", "AdventureWorksDW", [CommandTimeout=#duration(0, 0, 5, 0)]),
    Source2 = Sql.Database("server01", "AdventureWorksDW", [CommandTimeout=#duration(0, 0, 10, 0)]),
    Combined = Table.Combine({Source1, Source2})
in
    Combined

The solution is not intuitive. Perhaps even misleading. It is fragile, and it breaks dependency analysis in the tools. It also breaks the programmability contract because you can no longer simply update the data source definition and expect your M expressions to pick up the changes. It clearly needs a better implementation in a future compatibility level. But for now, you can simply define a data source with the address parameters shown above and then use the M expression unmodified in Tabular 1400. The credential object is magically applied to all corresponding data access functions that use the same parameters across all M expressions in the model. That’s how the Mashup engine works, as the following screenshot illustrates.

[Screenshot: data access calls with different options sharing a single credential]

Note: This workaround of using a data source merely as a holder of a credential object is not recommended and should be considered a last-resort option. If possible, create a common set of options for all data access calls to the same source and use a single data source object to replace the data access functions. Another option might be to register the same server with multiple names in DNS and then use a different server name in each data source definition.
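
For illustration, if you can settle on the longer of the two timeouts, the contrived example above collapses into a single data access call with one options record, which maps cleanly onto a single data source definition (a sketch, not the only possible consolidation):

let
    // one shared options record for every query against this source
    Source = Sql.Database("server01", "AdventureWorksDW", [CommandTimeout=#duration(0, 0, 10, 0)])
in
    Source

In the Tabular 1400 model itself, the options record would then live on the data source definition, and the table expression would reduce to a plain reference such as #"SQL/server01;AdventureWorksDW".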

Perhaps, you can consolidate your data access options and keep things straightforward in your Tabular 1400 model. But what if your data access functions use different native queries as in the following example?

let
    Source1 = Sql.Database("server01", "AdventureWorksDW", [Query="SELECT * FROM dimCustomer WHERE LastName = 'Walker'"]),
    Source2 = Sql.Database("server01", "AdventureWorksDW", [Query="SELECT * FROM dimCustomer WHERE LastName = 'Jenkins'"]),
    Combined = Table.Combine({Source1, Source2})
in
    Combined

Of course, one option is to rewrite the first native query to deliver the combined result so that you can eliminate the second function call. Yet, native query rewrites are not really required if you modified the M expression and used the Value.NativeQuery function, as in the following expression. Now there is only a single Sql.Database function call, which can nicely be replaced with a data source definition in Tabular 1400.

let
    Source = Sql.Database("server01", "AdventureWorksDW"),
    Table1 = Value.NativeQuery(Source, "SELECT * FROM dimCustomer WHERE LastName = 'Walker'"),
    Table2 = Value.NativeQuery(Source, "SELECT * FROM dimCustomer WHERE LastName = 'Jenkins'"),
    Combined = Table.Combine({Table1, Table2})
in
    Combined

Note: Even if you consolidated the queries, as in SELECT * FROM dimCustomer WHERE LastName = ‘Walker’ Or LastName = ‘Jenkins’, avoid putting this native query on the query parameter of the data source definition. The query parameter exists for full compatibility between data access functions and data source definitions. But it shouldn’t be used because it narrows the data source down to a particular query. You could not import any other data from that source. As explained earlier, you cannot define a second data source object with the same address parameters in Tabular 1400. So, define the data source in broadest terms and then use the Value.NativeQuery function to submit a native query.
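
Put differently, even with a consolidated native query, the recommended pattern keeps the data source definition broad and pushes the query through Value.NativeQuery, along the lines of this sketch:

let
    Source = #"SQL/server01;AdventureWorksDW",
    Customers = Value.NativeQuery(Source, "SELECT * FROM dimCustomer WHERE LastName = 'Walker' OR LastName = 'Jenkins'")
in
    Customers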

The next level of challenges revolves around the dynamic definition of data source parameters. This technique is occasionally used in Power BI Desktop files to maintain the address parameters for multiple data access function calls centrally. The following screenshot shows an example.

[Screenshot: a parameterized data access function in Power BI Desktop]
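
In case the screenshot is hard to make out, such a parameterized expression looks roughly like the following sketch, where ServerName and DatabaseName stand in for the Power BI Desktop parameters (the names are made up for the example):

let
    ServerName = "server01",
    DatabaseName = "AdventureWorksDW",
    Source = Sql.Database(ServerName, DatabaseName)
in
    Source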

Fortunately, the concept of M-based address parameter definitions is comparable to the concept of a data source definition in Tabular 1400. In most cases, you should be able to define the Tabular 1400 data source as always by simply using the parameter values directly. Then, replace the data access function calls in your M expressions with the reference to the data source, delete the M-based parameter definitions, and the job is done. The result is a common Tabular 1400 M expression.

let
    Source = #"SQL/server01;AdventureWorksDW"
in
    Source

Of course, this technique only works for trivial, single-valued parameter definitions. For more complex scenarios, well, read on.

The ultimate challenge in the advanced data access scenarios is the fully dynamic implementation of a data access function call. There are various flavors. Let’s analyze the following simple example first.

let
    #"Dynamic Function" = Sql.Database,
    Source = #"Dynamic Function"("server01", "AdventureWorksDW")
in
    Source

Clearly, this M expression is accessing a SQL Server database called AdventureWorksDW on server01. You can create a data source definition for this database in Tabular 1400 and replace these lines with a reference to that data source definition, as shown in the sample in the previous section above. This is easy because it wasn’t really a dynamic data source, yet. Here’s a fully dynamic example.

let
    Source = Sql.Database("server01", "Dynamic"),
    DataAccessPoints = Source{[Schema="dbo",Item="DataAccessPoints"]}[Data],
    NavigationTable = Table.AddColumn(DataAccessPoints, "Data", each Connect([FunctionName], [Parameter1], [Parameter2]))
in
    NavigationTable

This M expression retrieves all rows from a DataAccessPoints table in a SQL Server database called Dynamic. It then passes the column values of each row to a custom function called Connect, which then connects to the specified data source. The Connect function is based on the following M expression.

let
    Source = (FunctionName, Param1, Param2) => let
        FunctionTable = Record.ToTable(#shared),
        FunctionRow = Table.SelectRows(FunctionTable, each ([Name] = FunctionName)),
        Function = FunctionRow{0}[Value],
        Result = if Param2 = null then Function(Param1) else Function(Param1, Param2)
    in
        Result
in
    Source

Not only are the address parameters (Param1 and Param2) dynamically assigned, but the name of the data access function itself is also passed in as a parameter (FunctionName). So, unless you know the contents of the DataAccessPoints table, you cannot even determine what data sources this M expression accesses! You must evaluate the expressions against the actual data to know what credentials you need to supply, and you must define privacy settings. In this example, the following screenshot reveals that the expressions connect to an OData feed as well as a SQL Server database, but it could really be any supported data source type.
[Screenshot: fully dynamic data access resolving to an OData feed and a SQL Server database]
So, if you wanted to use such a dynamic construct, you would have to create the corresponding data source objects with their credentials in your Tabular 1400 model. Power BI Desktop can prompt you for the missing credentials if you added a new entry to the DataAccessPoints table, but Analysis Services cannot because there is no interactive user on a server, and processing would fail. You must add a new matching data source definition with the missing credentials and privacy settings upfront, which somewhat defeats the purpose of a fully dynamic data source definition.
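
To make this concrete, the DataAccessPoints table might contain rows like the following purely hypothetical sketch built with #table (the actual contents of your table will differ). Each row names a data access function and its parameters, which is why a matching SQL Server and OData data source definition, each with credentials and privacy settings, would already have to exist in the model before processing:

let
    // hypothetical contents of the DataAccessPoints table
    DataAccessPoints = #table(
        {"FunctionName", "Parameter1", "Parameter2"},
        {
            {"Sql.Database", "server01", "AdventureWorksDW"},
            {"OData.Feed", "https://services.odata.org/V4/Northwind/Northwind.svc/", null}
        }
    )
in
    DataAccessPoints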

Perhaps you are wondering at this point why we didn’t provide more flexible support for credential handling in Tabular 1400. One of the main reasons is that the footprint of the modern Get Data experience on the Mashup engine is already significant. It wasn’t justifiable to take a sledgehammer approach at the Mashup layer, especially when it concerns security features used across several Microsoft technologies, including Power BI.

Fully dynamic scenarios are certainly interesting and fun, but they are corner cases. The more common advanced data access scenarios can be handled with moderate effort in Tabular 1400. And a future compatibility level is going to remove the limitation of a single data source definition per set of address parameters. It requires some replumbing deep in the Mashup engine. On the other hand, how Tabular models are going to support fully dynamic data source definitions in future compatibility levels hasn’t yet been decided. One idea is to move credential storage out of the Tabular metadata. Another is to introduce new metadata objects that can be associated with data access functions through their address parameters. Perhaps another is to invent a form of processing handshake. And there may be other options. If you have a great idea, please post a comment or send it our way via ssasprev at microsoft.com, or use any other available communication channels such as UserVoice or MSDN forums.

That’s it for this excursion into handling advanced data access scenarios when moving M expressions from Power BI Desktop to a Tabular 1400 model. One of the next articles will show you how to use legacy provider data sources and native query partitions together with the new structured data sources and M partitions in a Tabular 1400 model. Stay tuned for more Get Data coverage on the Analysis Services Team Blog!

Analysis Services Team Blog

Pizza Time

Can you taste it?

“Pizza.”

Image courtesy of http://imgur.com/gallery/vnM7tz8.

How to Create Microsoft Word Templates in Microsoft Dynamics 365

In a previous blog you’ve learned How to Easily Generate Excel Templates, but what if you wanted to use Microsoft Word? Using Microsoft Word templates in Microsoft Dynamics 365 can be a huge time-saver when you want to provide a professional document for an entity, especially if you need to create the document more than once or for more than one record. Let’s get to it!

What are Microsoft Word templates?

Microsoft Word templates are documents that are created once but can be used on multiple records directly from Microsoft Dynamics 365. Templates can be used to generate anything from thank-you notes and birthday cards to a list of opportunities associated with an account by using XML – but before you run to get a developer, don’t worry, no coding is required!

When would you use a Microsoft Word template?

  • If you wanted to send a note to a key contact on their birthday, you could create a “Happy Birthday” template; once you print it off, you could add your own signature for a personal touch.
  • Another case would be if you wanted to remind an account of all their opportunities and the associated proposed budgets.
  • A third example is if you wanted an account to verify the contacts on their team and what levels of authority each has, such as purchasing authority, service-ticket creation authority, or IT architecture authority.
  • Or, anytime that you want information from an Account, Contact, Opportunity or any other entity to auto-populate into a Microsoft Word document for mailing, emailing or reporting.

How to Create a Microsoft Word Template?

  1. Navigate to Account View
  2. Select Excel Templates from the Command Bar and Create Excel Template

  3. Select Word Template
  4. Select the Desired Entity

  5. Select any related entities that you will use in the template, such as Opportunities, Quotes, Orders or Invoices. Once you click “Download Template”, the download will start automatically, and the resulting file will look like a blank MS Word document.

  6. In Microsoft Word, make sure the Developer tab is shown in the Ribbon. If it is not, right-click the Ribbon, select “Customize the Ribbon”, check the box next to Developer in the right pane, and select OK. Once the Developer tab is on the Ribbon, click “XML Mapping Pane” on the Developer tab to display the pane.
  7. Select CRM in the Custom XML Part drop-down menu.
  8. As you type your document, right-click fields in the XML Mapping Pane to insert them into the document.

  9. You can insert tables as well: create the table in Microsoft Word, select the row, then right-click the opportunity entity at the bottom of the XML Mapping Pane, expand the Insert Content Control section, and select Repeating.

  10. Then populate the table with content controls just like before.

  11. The next step is to upload the Word template so that it can be used. Save your Word document, navigate back to Dynamics 365, and repeat steps 1-4, but instead of selecting an entity, click the Upload button. Then browse to your file and upload the document.
  12. Once the upload is complete, the template is available for use on the account page. Test your work by navigating to an account, selecting the ellipsis > Word Templates > Personal Word Templates, and selecting your desired template.

  13. If you want to make any changes to the document once you download it, you can, just as you would in a regular Word document.

So, whether you’re sending out birthday cards, providing updates or confirming information, if you need to make the same form more than once, create a template and eliminate the busy work of recreating documents.

Be sure to subscribe to our blog so you have immediate updates to all of the new tips and tricks for Microsoft Dynamics 365, and if you would like more in-depth information on any CRM topic check out our CRM Book!

Happy Dynamics 365’ing!

PowerObjects- Bringing Focus to Dynamics CRM

How To Keep Up With China’s Rapidly Changing Automotive Industry

Also, while WhatsApp has a large following, China’s 768 million WeChat users have access to far more advanced features, such as electronic payments and online gaming. It seems as if WeChat payments are ubiquitous in China and will soon replace both cash and credit card payments.

The bottom line

In a globally connected world, automotive companies are reinventing themselves. China is a great example of jumping technology stages in an effort to become the leader in electric mobility.

Technology companies recognize the significance of China becoming a leader in the automotive world. Recognizing also means listening to the specifics of the market, the tech infrastructure ecosystem, and consumer preferences, and providing breakthrough cloud-based technologies that enable fast-moving players to stay ahead of the game.

This story also appeared on the SAP Community. Follow me @ulimuench.

Digitalist Magazine

DAX Reanimator Series: Moving Averages Controlled by Slicer (Part 1)

In 2013, Rob showed how to use a disconnected table and slicer to show moving averages with a variable time period. That post built on an earlier post, which steps through the process of creating moving averages.

What-If-Parameters, teased at MDIS in June 2017, have just been released to Power BI. This is a great new way to get disconnected tables into Power BI, and it should help more people than ever discover the joy of disconnected tables. With the latest update of Power BI, you can now CLICK A BUTTON to create a new What-If-Parameter. The button creates a disconnected table containing a series of numbers, along with a harvester measure to get the selected value. As an added bonus, you can even add the slicer to the page at the same time.

So, could we use the what-if-parameters to get the same results? Let’s try it. To begin with, we’ll need a Sales and Calendar table and a DAX measure for Units Sold.

Step 1: Create a new What-If-Parameter

From the Modeling tab, click the New Parameter button to bring up the What-If-Parameter window. This window creates a table and a DAX harvester measure. You can also add a slicer to the page. Values for Minimum, Maximum, and Increment are required. As a great touch, there’s an optional AlternateResult. This default value is in effect if no slicer selection is made, or if more than one value is selected.

     = SELECTEDVALUE ( Table[ColumnName], [AlternateResult] )

In a recent post, Matt Allington points out that this function is the same as:

     = IF ( HASONEVALUE ( Table[Column] ), VALUES ( Table[Column] ), [AlternateResult] )

Step 2: View your new disconnected table and slicer
The new table now displays in the fields area. Before, we had to make these tables in Power Query, or with the Enter Data button in Power BI. But no more! Now, the What-if button creates a table for us using DAX. No query or pasting required. By the way, this new tool is  only available in Power BI. Neither the What-If-Parameter nor the DAX functions behind it are yet available in Excel.

The What-If-Parameter button created this table:

Note the new GENERATESERIES() function!  This is how the table looks in Data view. There’s a table formula at the top and the results on the rows of the table.

     [MA Length] = GENERATESERIES ( -12, 12, 1 )

This formula plugs in the values from the window: start of -12, end of 12, and move up by 1. Increment can be any positive number like 2, 5, or .05.

     GENERATESERIES ( StartValue, EndValue, [IncrementValue] )

The formulas below are from Rob’s original post. I made a couple of changes. Firstly, I replaced the original harvester measure.

     [Variable Moving Sum] =
     VAR CurrentDate =
          IF (
               [MA Length Value] > 0,
               FIRSTDATE ( 'Calendar'[Date] ),
               LASTDATE ( 'Calendar'[Date] )
          )
     RETURN
     CALCULATE (
          [Units Sold],
          DATESINPERIOD (
               'Calendar'[Date],
               CurrentDate,
               [MA Length Value],
               MONTH
          )
     )

     [Variable Moving Average] =
     [Variable Moving Sum] /
          CALCULATE (
               DISTINCTCOUNT ( 'Calendar'[Year Month] ),
               DATESINPERIOD (
                    'Calendar'[Date],
                    LASTDATE ( 'Calendar'[Date] ),
                    [MA Length Value],
                    MONTH
               )
          )

Secondly, I set the start-date argument for DATESINPERIOD based on whether the range is forward or back. The original post advised this step to fix the time range for forward averaging.

Here’s the syntax for DATESINPERIOD():

     DATESINPERIOD ( Dates, StartDate, NumberOfIntervals, Interval )

How do we get the right number of months going forward?

DATESINPERIOD casts ahead or back x months, starting with the start date parameter. Going back, there’s no problem because it starts with the last date of the current month, and therefore includes the full current month. Going forward, the measure should begin with the first date of the current month in order to include the full month. Instead, if the last date is used for forward averaging, the period will include one day from the current month. And this makes the count of months one more than it should be! For example, with three months forward, the measure would divide by four instead of three. So, the formula for [Variable Moving Sum] here corrects this issue. Since our variable switches between FIRSTDATE() and LASTDATE(),  this gives us the right count of months.

After creating these measures, we can put [Variable Moving Average] on the chart. Then, we can apply color to the sales and moving average lines. In addition, we also have new formatting options: markers and line style.

Step 3: Create & format your moving average chart

The AlternateResult of the harvester measure is -3. Thus, the moving average displays at three months back, even without a slicer selection. All great stuff!

Personally, I love these new What-If-Parameters. And I especially love GENERATESERIES(). However, with fine tuning, we can go beyond what’s possible out of the box. So stay tuned later this year for some cool stuff in Part 2!

Thanks to Reid Havens, for designing the moving averages dashboard. Download it below!

Download the Power BI Desktop (.pbix) Report Here


PowerPivotPro