Tag Archives: Setting

Be Realistic When Setting an RPO

Vision Solutions, which recently merged with Syncsort, has been in the business of high availability and disaster recovery for more than two decades. The article below about setting a realistic Recovery Point Objective originally appeared on their blog.

Recovery Point Objective (RPO) is a key metric when outlining a Disaster Recovery plan. This particular metric refers to the amount of data that can be lost due to various disaster scenarios, and as you might expect, the lower the number the better the plan. That’s only true to a point, however.

If asked, nearly every end-user will immediately respond that they need a zero-byte RPO. It’s an instinctive reaction, and in a perfect world you’d provide zero data loss to everyone who asked for it. We don’t, however, live in a perfect world, and there are many factors that make a true zero-byte RPO a less-than-feasible metric to shoot for.

Three Considerations When Setting Your Recovery Point Objective

First, most applications simply cannot benefit from a zero-byte RPO. Nearly all Windows-based applications use memory caching and other technologies that mean some amount of data is temporarily stored in volatile memory (typically RAM). Any disruption will cause that data to be discarded when the system starts up again – whether it is caused by a temporary power fault or a full-scale failover situation.

Luckily, nearly all Windows apps are designed to deal with this minimal amount of data loss quickly and automatically when the application is brought back online. Microsoft SQL Server, for example, simply checks databases against checkpoints in the logs and rolls transactions forward or back as needed. What this means, however, is that some amount of data will always be lost in any disaster, no matter how fast your replication systems are.


Second, while synchronous replication tools can ensure that not a single byte of data is committed to the source disk system unless it is also written to the target disk, they do so at the expense of performance. Within a single fiber-connected network this isn’t too much of a problem: the replication happens quickly enough that I/O speeds aren’t significantly reduced.

However, across a WAN, the requirement that each I/O be written to the target before it is committed to the source will create massive disk I/O bottlenecks. The only way to fix this is to continue to increase the size of the WAN link to cost-prohibitive proportions.

Finally, when confronted with the staggeringly higher costs that accrue as you get closer and closer to a zero-byte RPO, end-users and business units will usually start to admit that they can afford to lose a certain amount of data and still be okay in the event of a declared disaster.

So, always remember three things when configuring your Recovery Point Objective metrics:

  1. Applications typically can’t benefit from true zero-byte RPO
  2. Systems do exist to reach that metric, but they require massive amounts of expensive infrastructure
  3. When pressed, most business units and end-users will admit that they can be flexible on RPO metrics when faced with these two facts.

Plan for the most effective RPO for your organization, but recognize that a zero-byte RPO may be both fiscally and technically out of reach in the majority of circumstances. Get as close to it as you can without breaking either your network or the bank.

Vision Solutions offers high availability and disaster recovery solutions to ensure you can keep critical apps online and restore any lost data in the event of a disaster. Download their free 2017 State of Resilience report to review the latest trends in disaster recovery planning.


Syncsort + Trillium Software Blog

Introduction to a new Dynamics 365 Setting – Allow text wrapping in form fields labels and values

Microsoft’s Dynamics 365 July Update includes a simple yet very useful new setting called “Allow text wrapping in form fields labels and values”. This setting is available in “System Settings” under the “General” tab.


In this blog, I will go through what this setting does and how it changes the way a field’s label and value are presented on a form.
I created a custom attribute called “Magnetism Auckland Office Printer ID”. I added it to a form section which has “Field Label Width” set to 115. I saved and published all the changes.


To see what my field label and its value looked like without “Allow text wrapping in form fields labels and values”, I went to System Settings and set its value to “No”. I created a new Contact record, and the following is what the “Magnetism Auckland Office Printer ID” field’s label and value looked like on the form.


To see what difference the new setting “Allow text wrapping in form fields labels and values” would make, I went back to System Settings and changed its value to “Yes”. Then I reopened the same contact record, and the following is what the “Magnetism Auckland Office Printer ID” field’s label and value looked like on the form.


The label of the field was successfully wrapped. However, the value stored inside the field wasn’t wrapped. The value I stored in the field was a string 52 characters long, but only the first 41 characters were visible on the form. This could be because the setting doesn’t wrap a single unbroken string. To confirm, I populated the value in “Magnetism Solutions Auckland Office Printer ID” with two strings of 21 characters each. This time the value stored inside the field was wrapped too.


Another thing I wanted to check was whether the value would still wrap even when the label of the field is not too long. To test this case, I changed the label of the “Magnetism Auckland Office Printer ID” field to “Printer ID”, then saved and published the changes. Then I reopened the same contact record, and the value stored in “Magnetism Solutions Auckland Office Printer ID” (with the label changed to “Printer ID”) was still wrapped.


However, there was one more thing I was curious about: how many lines of a field label can be wrapped and displayed without anything fading? To test this, I changed the label again to make it even longer – “Magnetism Solutions Auckland Office Wireless Printer Identification Number” – then saved and published the changes and reopened the same contact record. This time only the first two lines were completely visible, and the rest of the label text was either missing or started to fade away.



Magnetism Solutions Dynamics CRM Blog

New White Paper! Mainframe Data as a Service: Setting Your Most Valuable Data Free

Enterprises across the globe still run critical processes on mainframes. Estimates suggest that as much as 80% of transactional corporate data resides in mainframe systems. Unfortunately, in most cases the mainframe data in these enterprise organizations remains inaccessible to the new data analytics tools emerging on a variety of open source and cloud platforms, including Hadoop and Spark.

There is a tremendous opportunity to surface mainframe data into this new world of fast-moving open technology. Organizations that have freed up mainframe data for cross-enterprise consumption are achieving greater agility, flexibility and lower costs.

Our latest white paper, Mainframe Data as a Service: Setting Your Most Valuable Data Free, dives deeper into the complex challenge of connecting Big Iron to Big Data and explores how extending Data as a Service (DaaS) to mainframe data opens up a large and often impenetrable source of valuable corporate information.


Download the white paper now!


Syncsort + Trillium Software Blog

Setting up data alerts on streaming data sets

Streaming datasets in Power BI are a cool feature that lets you analyze data as it occurs. I was playing around with setting up the Twitter demo using Flow that Sirui describes here: https://powerbi.microsoft.com/en-us/blog/push-rows-to-a-power-bi-streaming-dataset-without-writing-any-code-using-microsoft-flow/ and was thinking: wouldn’t it be cool if I could get alerts based on the data that comes in? For example, I want to get an alert when I receive more than 20 negative Power BI tweets in the last hour. Unfortunately you cannot create measures on these datasets at this time to add that logic, but there is a way. Let’s take a look.

If you follow the instructions above, you will end up with a dataset in Power BI that gets fed tweets and their sentiment in real time. One change I made to the flow described there is that I turned on “Historic data analysis”:


This gives me a dataset that I can build reports on and use to analyze past data, similar to a push API dataset (more on this here).
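As an aside: the Flow approach means you don’t write any code, but if you ever want to push rows yourself, the streaming dataset exposes a push URL (Power BI shows it, along with a sample script, when you create the dataset). A minimal PowerShell sketch of pushing one row follows; the endpoint URL and the column names (time, sentiment, tweet) are placeholders that should match whatever schema you defined:

#Push one row to the streaming dataset's push URL (placeholder endpoint and columns)
$endpoint = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<your-key>"
$payload = @{
    time      = (Get-Date).ToString("o")   #ISO 8601 timestamp
    sentiment = 0.31
    tweet     = "Example tweet text"
}
#The body must be a JSON array of row objects
Invoke-RestMethod -Method Post -Uri $endpoint -Body (ConvertTo-Json @($payload)) -ContentType "application/json"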

So now that I have this dataset, I can start creating reports and dashboards (you can also add more data, as described here, to make things a bit more interesting):


and pin them to your dashboard:


So far so good, but now I want to see only the count of tweets with sentiment <= 0.5 in the last hour. Here is where the trouble starts, as I can’t express “last hour” in the report designer. Luckily there is another smart feature that helps here, called Q&A: I can just ask the question.


This immediately gives me the answer I need, just not in quite the right shape: for data alerts to work, this needs to be a single card, not a chart. There is an option to change this manually, so I open the visualization pane and select the card visual:


Now I pin it to the dashboard and after renaming the tile I have the number of negative tweets in the last hour:


Now, as the last step, I configure my data alert to notify me when I get more than 20 negative tweets in an hour:


Done! This again goes to show the power of Q&A. Pretty cool scenario, and NO code required…


Kasper On BI

6 Best Practices for Setting Up a Lead Nurturing Program


The best practices for setting up a lead-nurturing program include:

1. Develop your lead nurturing program one step at a time.

“Going from whatever you’re currently doing (likely a variation of treating all leads equally) to executing a complex lead-management strategy doesn’t have to happen in one giant step. For these organizations, achieving a strong nurture-marketing strategy (let alone the next step to closed-loop) should be seen as a multistep process.” (Heinz)

2. Identify ‘nurture-able’ leads carefully.

“Identify your ‘nurture-able’ contacts. What is the population of leads and contacts that you are free to nurture without complicating the efforts of sales or customer service (if you plan to nurture current accounts)? Add lapsed customers to the mix if it makes sense to do so. Note: The goal here is not to build the biggest database you can. There’s probably a segment of people you could nurture that would ‒ fairly or unfairly ‒ regard even a modest nurturing program as a relentless carpet-bombing ‘spampaign’ from a vendor they hope never to hear from again.” (Scearce)

“Define an ideal lead, and treat them differently: Create a relatively simple definition of an ideal prospect, and start with two buckets ‒ Do they qualify or do they not? For example, does an ideal prospect need to come from a company of a particular size? From a particular titled contact? From a particular industry? Set just two to three criteria, and start triaging leads accordingly.” (Heinz)

“I would implement nurture programs that map the sales cycle to the buying cycle. The typical buying cycle flow follows: need, learn, evaluate, negotiate, purchase, implement, advocate.” (Eisenberg)

3. Start with a single campaign and plan to segment in the future.

“Start a single nurture campaign for all leads. Sure, it would be great to have different nurture segments by industry, expected close date, reason the deal may be delayed, and so forth. But if you’re just starting out nurturing leads, start with a single nurture campaign for all leads. A monthly newsletter, or a regular webinar offer, or even an occasional free white paper offer can keep you top of mind with prospects not yet ready to buy. Get complicated later, but get something going to those latent prospects right away.” (Heinz)

“There are many ways for you to segment your contacts. You could segment by geography, or create-date, or lead source, etc. But at this point, you should be looking for the Venn diagram overlap of ‘known recent interest in a high-value product my company sells’ and ‘population of significant quantity.’ Leads and contacts with these two attributes are very good inputs to your content strategy.” (Scearce)

4. Remember that content is king.

“An important area to address related to nurturing is content. A first step would be to take an inventory of what content you have. You likely have more useful content than you realize. This includes whitepapers, videos, blog posts, articles, use cases, guest blogs, etc. Categorize them by the consumer they were designed for (technical sponsor vs. business sponsor, industry, and stage of interest/purchase). You can reuse content in very effective ways by breaking them into series pieces. Also, less is more with delivering content. People will get lost in long emails; better to present content in list form, such as Top 10 lists, 3 Things Your Competitors Are Doing, etc. Videos are easy, and people like watching short 90-second videos with success stories, use cases, industry trends.” (Vanella)

5. Act locally, think globally. 

“For those with worldwide businesses, I would create nurture programs for various regions, such as Europe, so that you are localizing the communications.” (Eisenberg)

6. Don’t invest in software right off the bat.

“When setting up a lead nurturing program, don’t buy software. It’s too early for that. You don’t know what you need; you may not require every bell and whistle in the foreseeable future. Familiarize yourself with what is possible given the budget and expertise available to you. When you know your audience and have a plan to engage them, you have a good sense of what you need your software to do. This is important because there’s a wee bit of an arms race going on in the marketing automation space. The leading and emerging vendors in this category are competing for a fast-growing but still-nascent market.” (Scearce)

In closing 

There’s no time like the present to stop losing valuable prospects and start grooming them into qualified leads. Setting up a lead-nurturing program doesn’t have to be a daunting task. The most important takeaway here is that you get in the game now, and take one step at a time. Identify your “nurture-able” leads, start with a single campaign, add in some content marketing, and worry about automating (and purchasing the needed software) down the road.

Have any tips for creating top-notch lead-nurturing campaigns? Let us know in the comments below!


Act-On Blog

Setting up Azure Disk Encryption for a Virtual Machine with PowerShell

As I discussed in my previous blog post, I opted to use Azure Disk Encryption for my virtual machines in Azure, rather than Storage Service Encryption. Azure Disk Encryption utilizes BitLocker inside the VM. Enabling Azure Disk Encryption involves these Azure services:

  • Azure Active Directory for a service principal
  • Azure Key Vault for a KEK (key encryption key), which wraps around the BEK (BitLocker encryption key)
  • Azure Virtual Machine (IaaS)

Following are four scripts which configure encryption for an existing VM. I initially had it all as one single script, but I purposely separated them. Now that they are modular, if you already have a Service Principal and/or a Key Vault, you can skip those steps. I have my ‘real’ version of these scripts stored in an ARM Visual Studio project (same logic, just with actual names for the Azure services). These PowerShell templates go along with other ARM templates to serve as source control for our Azure infrastructure.

As any expert will immediately know by looking at my scripts below, I’m pretty much a PowerShell novice. So, be kind, dear reader. My purpose is to document the steps and the flow, add some commentary, and pull together a couple of pieces I found on different documentation pages.

Step 1: Set up Service Principal in AAD

<#
.SYNOPSIS
Creates Service Principal in Azure Active Directory
.DESCRIPTION
This script creates a service principal in Azure Active Directory.
A service principal is required to enable disk encryption for VM.
.NOTES
File Name: CreateAADSvcPrinForDiskEncryption.ps1
Author : Melissa Coates
Notes: Be sure the variables in the input area are completed, following all standard naming conventions.
The $aadSvcPrinAppPassword needs to be removed before saving this script in source control.
.LINK
Supporting information:
https://blogs.msdn.microsoft.com/azuresecurity/2015/11/16/explore-azure-disk-encryption-with-azure-powershell/
https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption
#>

#-----------------------------------------

#Input Area
$subscriptionName = 'MyAzureSubscriptionDev'
$aadSvcPrinAppDisplayName = 'VMEncryptionSvcPrinDev'
$aadSvcPrinAppHomePage = 'http://FakeURLBecauseItsNotReallyNeededForThisPurpose'
$aadSvcPrinAppIdentifierUri = 'https://DomainName.com/VMEncryptionSvcPrinDev'
$aadSvcPrinAppPassword = 'SuperStrongPassword'

#-----------------------------------------

#Manual login into Azure
Login-AzureRmAccount -SubscriptionName $subscriptionName

#-----------------------------------------

#Create Service Principal App to Use For Encryption of VMs
$aadSvcPrinApplication = New-AzureRmADApplication -DisplayName $aadSvcPrinAppDisplayName -HomePage $aadSvcPrinAppHomePage -IdentifierUris $aadSvcPrinAppIdentifierUri -Password $aadSvcPrinAppPassword
New-AzureRmADServicePrincipal -ApplicationId $aadSvcPrinApplication.ApplicationId

Step 2: Create Azure Key Vault

<#
.SYNOPSIS
Creates Azure Key Vault.
.DESCRIPTION
This script does the following:
1 - Creates a key vault in Azure.
2 - Allows the Azure Backup Service permission to the key vault.
This is required if Recovery Vault will be used for backups.
A key vault is required to enable disk encryption for VM.
.NOTES
File Name: ProvisionAzureKeyVault.ps1
Author : Melissa Coates
Notes: Be sure the variables in the input area are completed, following all standard naming conventions.
The key vault must reside in the same region as the VM which will be encrypted.
A Premium key vault is being provisioned so that an HSM key can be created for the KEK.
The 262044b1-e2ce-469f-a196-69ab7ada62d3 ID refers to the Azure Backup service (which is why it is not a variable).
.LINK
Supporting information:
https://blogs.msdn.microsoft.com/azuresecurity/2015/11/16/explore-azure-disk-encryption-with-azure-powershell/
https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption
#>

#-----------------------------------------

#Input Area
$subscriptionName = 'MyAzureSubscriptionDev'
$resourceGroupName = 'MyDevRG'
$keyVaultName = 'KeyVault-Dev'
$keyVaultLocation = 'East US 2'

#-----------------------------------------

#Manual login into Azure
#Login-AzureRmAccount -SubscriptionName $subscriptionName

#-----------------------------------------

#Create Azure Key Vault
New-AzureRmKeyVault -VaultName $keyVaultName -ResourceGroupName $resourceGroupName -Location $keyVaultLocation -Sku 'Premium'

#-----------------------------------------

#Permit the Azure Backup service to access the key vault
Set-AzureRmKeyVaultAccessPolicy -VaultName $keyVaultName -ResourceGroupName $resourceGroupName -PermissionsToKeys backup,get,list -PermissionsToSecrets get,list -ServicePrincipalName 262044b1-e2ce-469f-a196-69ab7ada62d3

Step 3: Connect Service Principal with Key Vault

<#
.SYNOPSIS
Enables the service principal for VM disk encryption to communicate with Key Vault.
.DESCRIPTION
This script does the following:
A - Allows service principal the selective permissions to the key vault so
that disk encryption functionality works.
B - Creates a KEK (Key Encryption Key). For Disk Encryption, a KEK is required
in addition to the BEK (BitLocker Encryption Key).
Prerequisite 1: Service Principal name (see CreateAADSvcPrinForDiskEncryption.ps1)
Prerequisite 2: Azure Key Vault (see ProvisionAzureKeyVault.ps1)
.NOTES
File Name: EnableSvcPrinWithKeyVaultForDiskEncryption.ps1
Author : Melissa Coates
Notes: Be sure the variables in the input area are completed, following all standard naming conventions.
The key vault must reside in the same region as the VM being encrypted.
The key type can be either HSM or Software (HSM offers additional security but does require a Premium key vault).
.LINK
Supporting information:
https://blogs.msdn.microsoft.com/azuresecurity/2015/11/16/explore-azure-disk-encryption-with-azure-powershell/
https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption
#>

#Input Area
$subscriptionName = 'MyAzureSubscriptionDev'
$resourceGroupName = 'MyDevRG'
$aadSvcPrinAppDisplayName = 'VMEncryptionSvcPrinDev'
$keyVaultName = 'KeyVault-Dev'
$keyName = 'VMEncryption-KEK'
$keyType = 'HSM'

#-----------------------------------------

#Manual login into Azure
#Login-AzureRmAccount -SubscriptionName $subscriptionName

#-----------------------------------------

#Allow the Service Principal Permissions to the Key Vault
$aadSvcPrinApplication = Get-AzureRmADApplication -DisplayName $aadSvcPrinAppDisplayName
Set-AzureRmKeyVaultAccessPolicy -VaultName $keyVaultName -ServicePrincipalName $aadSvcPrinApplication.ApplicationId -PermissionsToKeys 'WrapKey' -PermissionsToSecrets 'Set' -ResourceGroupName $resourceGroupName

#-----------------------------------------

#Create KEK in the Key Vault
Add-AzureKeyVaultKey -VaultName $keyVaultName -Name $keyName -Destination $keyType

#-----------------------------------------

#Allow Azure platform access to the KEK
Set-AzureRmKeyVaultAccessPolicy -VaultName $keyVaultName -ResourceGroupName $resourceGroupName -EnabledForDiskEncryption

Step 4: Enable Disk Encryption

<#
.SYNOPSIS
Enables disk encryption for a VM.
.DESCRIPTION
This script enables disk encryption for an Azure virtual machine.
Prerequisite 1: Service Principal name (see CreateAADSvcPrinForDiskEncryption.ps1)
Prerequisite 2: Azure Key Vault (see ProvisionAzureKeyVault.ps1)
Prerequisite 3: Permissions to Key Vault for Service Principal (see EnableSvcPrinWithKeyVaultForDiskEncryption.ps1)
.NOTES
File Name: EnableAzureDiskEncryption.ps1
Author : Melissa Coates
Notes: Be sure the variables in the input area are completed, following all standard naming conventions.
The Azure VMs must already exist and be running.
To verify when completed: Get-AzureRmVmDiskEncryptionStatus -ResourceGroupName $resourceGroupName -VMName $vmName
.LINK
Supporting information:
https://blogs.msdn.microsoft.com/azuresecurity/2015/11/16/explore-azure-disk-encryption-with-azure-powershell/
https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption
#>

#-----------------------------------------

#Input Area
$subscriptionName = 'MyAzureSubscriptionDev'
$resourceGroupName = 'MyDevRG'
$keyVaultName = 'KeyVault-Dev'
$keyName = 'VMEncryption-KEK'
$vmName = 'VMName-Dev'

#-----------------------------------------

#Manual login into Azure
#Login-AzureRmAccount -SubscriptionName $subscriptionName

#-----------------------------------------

#Enable Encryption on Virtual Machine
$keyVault = Get-AzureRmKeyVault -VaultName $keyVaultName -ResourceGroupName $resourceGroupName
$diskEncryptionKeyVaultUrl = $keyVault.VaultUri
$keyVaultResourceId = $keyVault.ResourceId
$keyEncryptionKeyUri = Get-AzureKeyVaultKey -VaultName $keyVaultName -KeyName $keyName
Set-AzureRmVMDiskEncryptionExtension -ResourceGroupName $resourceGroupName -VMName $vmName -AadClientID $aadSvcPrinApplication.ApplicationId -AadClientSecret $aadSvcPrinAppPassword -DiskEncryptionKeyVaultUrl $diskEncryptionKeyVaultUrl -DiskEncryptionKeyVaultId $keyVaultResourceId -KeyEncryptionKeyUrl $keyEncryptionKeyUri.Id -KeyEncryptionKeyVaultId $keyVaultResourceId
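One thing to note: Step 4 reuses $aadSvcPrinApplication and $aadSvcPrinAppPassword from Step 1, so it assumes you run it in the same PowerShell session (or re-establish those values first). A small sketch of doing that, plus the verification check already mentioned in the script notes:

#If running Step 4 in a fresh session, re-establish the service principal details from Step 1
#(the password must match the one used when the AAD application was created)
$aadSvcPrinAppDisplayName = 'VMEncryptionSvcPrinDev'
$aadSvcPrinAppPassword = 'SuperStrongPassword'
$aadSvcPrinApplication = Get-AzureRmADApplication -DisplayName $aadSvcPrinAppDisplayName

#Check encryption status once the extension has finished provisioning
Get-AzureRmVmDiskEncryptionStatus -ResourceGroupName $resourceGroupName -VMName $vmName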


Blog – SQL Chick

Setting your Default Address List for Microsoft Dynamics CRM

In Microsoft Dynamics CRM, your contacts are moved by default into your Contacts address list in Outlook.  However, Outlook defaults to your Global Address list, which means you do not automatically see your CRM contacts in Outlook.  In this article, we will assist you with updating your default address list.

To set your default address list, click the Address Book button on the Outlook ribbon.


Once you have opened the address book, click Tools.


Then choose Options.


Once you are in Options, take the following steps:

1. Start with contact folders
2. Choose Contacts in the “When opening” dropdown
3. Click OK


You now have the default list set to see your CRM contacts.  For more tips, sign up for our email newsletter by visiting http://www.toplineresults.com/contact-us/.


CRM Software Blog

Why 2017 is setting up to be the year of GPU chips in deep learning


It’s been a while since anyone in the tech world really cared about hardware, but that could change in 2017, thanks to the ascendance of deep learning.

According to Forrester analyst Mike Gualtieri, we may be entering a golden age of hardware driven largely by graphics processing units, or GPUs. This alternative to traditional CPU technology is optimized to process tasks in parallel, rather than sequentially. This makes GPU chips a good fit for training deep learning models, which involves crunching enormous volumes of data again and again until the models can recognize images, parse natural speech or recommend products to online shoppers.

“There will be a hardware renaissance because of the compute needs to train these models,” Gualtieri said.

Need grows for GPU tools

GPU technology has been around for decades, but only recently has it gained traction among enterprises. It was traditionally used to enhance computer graphics, as the name suggests. But as deep learning and artificial intelligence have grown in prominence, the need for fast, parallel computation to train models has increased.

“A couple years ago, we wouldn’t be looking at special hardware for this,” said Adrian Bowles, founder of analyst firm STORM Insights Inc., based in Boston. “But with [deep learning], you have a lot of parallel activities going on, and GPU-based tools are going to give you more cores.”

He said he expects the market for GPU chips to heat up in the year ahead. NVIDIA, one of the first companies to start marketing GPU chips for analytics, teamed up with IBM and Microsoft this year to put its GPU chips in server technology aimed at cognitive applications. The biggest player in the processing market, Intel, may also look to get deeper into GPU technology.

GPUs mean changes for developers

Bowles said the growing popularity of this hardware, in some cases, may mean developers will have to learn new ways of working. Since the manner in which a GPU processes data is so different from a CPU, applications will need to be built differently to take full advantage of the benefits. This means developers will need more training to keep their skills sharp going forward.

Similarly, more enterprises will start to build their data architectures around GPU technology, said Tom Davenport, founder of the International Institute for Analytics, based in Portland, Ore., in a webcast. He said more businesses are looking at ways to implement image-recognition technology, which is built around deep learning models. Larger companies, like Google, Uber and Tesla, are also pushing forward with autonomous vehicles, which lean heavily on learning models. The growing role for these types of tasks will increase the demand for GPUs.

“I think we’re going to start seeing more computing architectures that focus specifically on GPUs,” Davenport said. “That’s been a part of the success of these deep learning models for image recognition and other applications.”

GPU technology could also enhance the trend of the citizen data scientist, said Paul Pilotte, technical marketing manager at MathWorks Inc., based in Natick, Mass. He pointed out that Amazon this year introduced GPU chips to its Elastic Compute Cloud platform. The fact it’s all hosted means users don’t need to be hardware experts. This lowers the bar to entry and makes the technology available to a wider user base.

“The workflows for deep learning, prescriptive analytics and big data will become more accessible,” Pilotte said.



SearchBusinessAnalytics: BI, CPM and analytics news, tips and resources

Setting a new pdf Distribution and estimating its parameters by MLE


Good morning. I need to create a pdf and estimate its parameters by the MLE method. The distribution is a power law ($f(x)=C(k)\,x^{-k}$) and I need to estimate the $k$ parameter; $C(k)$ is determined by the normalization condition $\int_a^b C(k)\,x^{-k}\,dx=1$. I have tried to define the power distribution with the ProbabilityDistribution function using, for example, C=5, but when I want to plot it, Mathematica doesn’t display the plot. What I’ve been writing is

distr=ProbabilityDistribution[5/x^1.05,{x,0,Infinity}]
PDF[distr,x]
Plot[%,{x,0,1}]

NOTE: when I try to define the distribution function I’m setting $k=1.05$, just to check whether I’m doing it right.

Could you help me, please?

Thanks for your answers.
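For reference, with finite bounds $0 < a < b$ and $k \neq 1$, the normalization condition stated above gives
$$C(k) = \frac{1-k}{\,b^{1-k} - a^{1-k}\,}.$$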


Recent Questions – Mathematica Stack Exchange

Setting Up a PC for Azure Cortana Intelligence Suite Development

Some development aspects of the Cortana Intelligence Suite can occur in the Azure Portal. There are also some additional client tools which are helpful, or potentially required, to fully create solutions. This is a quick checklist of tools you probably want to install on a development machine for purposes of working on the analytics, BI, and/or data warehousing elements of Azure.

1. SQL Server Management Studio (SSMS)

The latest version of SSMS is recommended for compatibility with Azure services, as well as backwards compatibility with all SQL Server versions back to 2008.

Download SSMS:  https://msdn.microsoft.com/en-us/library/mt238290.aspx

2. Visual Studio 2015

The latest 2015 version of Visual Studio is recommended for full functionality for the newest components. If you choose to do a customized VS installation, be sure to select the option to install Microsoft Web Developer Tools. (If you don’t, when you try to install the Azure SDK later it won’t install properly because prerequisites are missing. Yeah, yeah, I’ve been there.)

If you don’t have a license available, look into using the VS Community edition.

Download VS 2015: https://www.visualstudio.com/downloads/download-visual-studio-vs 

Note: in addition to “Visual Studio 2015,” there’s also a “Visual Studio 15 Preview.” The 15 Preview is *not* the same thing as Visual Studio 2015, even though 15 is in its name. So, just watch out for that in terms of naming.


3. SQL Server Data Tools (SSDT) for Visual Studio 2015

Here is where you gain the ability to create BI projects: SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), and SQL Server Reporting Services (SSRS). These SQL Server BI projects aren’t considered part of Cortana Intelligence Suite, but if you’re creating an analytics, BI, and/or data warehousing solution, you may need at least one of these types of BI projects as part of the overall solution.

With the latest version of SSDT for VS 2015, you’ll also be able to interact with all versions of SQL Server, as well as Azure SQL Database and Azure SQL Data Warehouse.

Example of what an Azure SQL Data Warehouse cloud resource looks like from within Visual Studio (SSDT):


4. Azure SDK

The Azure SDK sets up lots of libraries; the main features we are looking for from the Azure SDK right away are (a) the ability to use the Cloud Explorer within Visual Studio, and (b) the ability to create ARM template projects for automated deployment purposes. In addition to the Server Explorer we get from Visual Studio, the Cloud Explorer from the SDK gives us another way to interact with our resources in Azure. 
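As a rough sketch (not specific to Cortana Intelligence Suite), deploying one of those ARM template projects can also be scripted with the AzureRM cmdlets; the resource group and file names below are placeholders:

#Deploy an ARM template and its parameter file to an existing resource group (placeholder names)
Login-AzureRmAccount
New-AzureRmResourceGroupDeployment -ResourceGroupName 'MyDevRG' `
    -TemplateFile '.\azuredeploy.json' `
    -TemplateParameterFile '.\azuredeploy.parameters.json'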

Example of what the Cloud Explorer pane looks like in Visual Studio (by Resource Group, and by Resource Type):


Example of what you’ll see related to Azure Resource Groups after the Azure SDK is installed:


Example of various QuickStart project types available (none of these are directly related to Cortana Intelligence Suite, but might factor into your overall solution):


5.  Relevant Visual Studio Extensions

These are important extensions for working with Cortana Intelligence Suite at this time:
-Microsoft Azure Data Lake Tools for Visual Studio 2015 (automatically installed as part of the Azure SDK)
-Microsoft Azure HDInsight Tools for Visual Studio 2015 (automatically installed as part of the Azure SDK)
-Microsoft Azure Data Factory Tools for Visual Studio 2015

At the time of this writing (June 2016), Azure Data Factory Tools are not automatically installed with the Azure SDK. That will probably change at some point I would guess.

Example of the Cortana Intelligence Suite projects you’ll see after the Azure extensions are installed (a U-SQL project is associated with the Azure Data Lake Analytics service):


6.  Microsoft Azure Storage Explorer and/or AzCopy

I really like the new standalone Azure Storage Explorer for uploading and downloading files to Azure Storage. AzCopy is another alternative – AzCopy is a command line utility instead of a graphical UI, so it’s better suited for automation and scripting purposes. 
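For reference, a minimal AzCopy upload looks something like this (classic Windows AzCopy syntax, run from a command prompt; the local folder, storage account, container, and key are placeholders):

#Recursively upload a local folder to a blob container (placeholder values)
AzCopy /Source:C:\Data\Uploads /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /DestKey:<storage-account-key> /S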

Example of what Azure Storage Explorer looks like:



Blog – SQL Chick