Tag Archives: Setting

New White Paper! Mainframe Data as a Service: Setting Your Most Valuable Data Free

Enterprises across the globe still run critical processes on mainframes. Estimates suggest that as much as 80% of corporate transactional data resides in mainframe systems. Unfortunately, in most cases this mainframe data remains inaccessible to the new data analytics tools emerging on a variety of open source and cloud platforms, including Hadoop and Spark.

There is a tremendous opportunity to surface mainframe data into this new world of fast-moving open technology. Organizations that have freed up mainframe data for cross-enterprise consumption are achieving greater agility and flexibility at lower cost.

Our latest white paper, Mainframe Data as a Service: Setting Your Most Valuable Data Free, dives deeper into the complex challenge of connecting Big Iron to Big Data and explores how extending Data as a Service (DaaS) to mainframe data opens up a large and often impenetrable source of valuable corporate information.


Download the white paper now!


Syncsort + Trillium Software Blog

Setting up data alerts on streaming data sets

Streaming data sets in Power BI are a cool feature that lets you analyze data as it occurs. I was playing around with setting up the cool Twitter demo using Flow that Sirui describes here: https://powerbi.microsoft.com/en-us/blog/push-rows-to-a-power-bi-streaming-dataset-without-writing-any-code-using-microsoft-flow/, and I was thinking: wouldn’t it be cool if I could get alerts based on the data that comes in? For example, I want to get an alert when I get more than 20 negative Power BI tweets in the last hour. Unfortunately you cannot create measures on a streaming data set at this time to add that logic, but there is a way. Let’s take a look.

If you follow the instructions above you will end up with a dataset in Power BI that gets fed tweets and their sentiment in real time. One change I made to the flow above is that I turned on “Historic data analysis”.


This gives me a dataset that I can build reports on and use to analyze past data, similar to a push API dataset (more on this here).
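As an aside, you don’t need Flow to feed the dataset: you can also POST JSON rows straight to its push URL. Here is a minimal sketch in PowerShell; the push URL is a placeholder (grab the real one from the dataset’s API Info page) and the column names are assumptions that need to match your own dataset schema.

#Minimal sketch: push one row into a Power BI streaming dataset from PowerShell
#The URL and column names are placeholders - replace with your dataset's push URL and schema
$pushUrl = 'https://api.powerbi.com/beta/<tenant-id>/datasets/<dataset-id>/rows?key=<dataset-key>'
$rows = @(
    @{
        'tweet text' = 'Power BI streaming datasets are pretty cool'
        'sentiment'  = 0.35
        'created at' = (Get-Date).ToUniversalTime().ToString('o')
    }
)
#The endpoint expects a JSON array of row objects
Invoke-RestMethod -Method Post -Uri $pushUrl -ContentType 'application/json' -Body (ConvertTo-Json -InputObject $rows)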

So now that I have this dataset, I can start creating reports and dashboards (you can also add more data, as described here, to make things a bit more interesting):


and pin them to your dashboard:


So far so good, but now I want to see only the count of tweets with sentiment <= 0.5 in the last hour. Here is where the trouble starts, as I can’t express “last hour” in the report designer. Luckily there is another smart feature called Q&A that will help us here: I can just ask the question.


This immediately gives me the answer I need, just not quite in the right shape: I need a single card, not a chart, for data alerts to work. There is an option to change this manually, so in this case I open the visualizations pane and select the card:


Now I pin it to the dashboard and after renaming the tile I have the number of negative tweets in the last hour:


Now, as a last step, I can configure a data alert to notify me when I get more than 20 negative tweets in an hour:


Done! This again goes to show you the power of Q&A. Pretty cool scenario, and NO code required.


Kasper On BI

6 Best Practices for Setting Up a Lead Nurturing Program


The best practices for setting up a lead-nurturing program include:

1. Develop your lead nurturing program one step at a time.

“Going from whatever you’re currently doing (likely a variation of treating all leads equally) to executing a complex lead-management strategy doesn’t have to happen in one giant step. For these organizations, achieving a strong nurture-marketing strategy (let alone the next step to closed-loop) should be seen as a multistep process.” (Heinz)

2. Identify ‘nurture-able’ leads carefully.

“Identify your ‘nurture-able’ contacts. What is the population of leads and contacts that you are free to nurture without complicating the efforts of sales or customer service (if you plan to nurture current accounts)? Add lapsed customers to the mix if it makes sense to do so. Note: The goal here is not to build the biggest database you can. There’s probably a segment of people you could nurture that would ‒ fairly or unfairly ‒ regard even a modest nurturing program as a relentless carpet-bombing ‘spampaign’ from a vendor they hope never to hear from again.” (Scearce)

“Define an ideal lead, and treat them differently: Create a relatively simple definition of an ideal prospect, and start with two buckets ‒ Do they qualify or do they not? For example, does an ideal prospect need to come from a company of a particular size? From a particular titled contact? From a particular industry? Set just two to three criteria, and start triaging leads accordingly.” (Heinz)

“I would implement nurture programs that map the sales cycle to the buying cycle. The typical buying cycle flow follows: need, learn, evaluate, negotiate, purchase, implement, advocate.” (Eisenberg)

3. Start with a single campaign and plan to segment in the future.

“Start a single nurture campaign for all leads. Sure, it would be great to have different nurture segments by industry, expected close date, reason the deal may be delayed, and so forth. But if you’re just starting out nurturing leads, start with a single nurture campaign for all leads. A monthly newsletter, or a regular webinar offer, or even an occasional free white paper offer can keep you top of mind with prospects not yet ready to buy. Get complicated later, but get something going to those latent prospects right away.” (Heinz)

“There are many ways for you to segment your contacts. You could segment by geography, or create-date, or lead source, etc. But at this point, you should be looking for the Venn diagram overlap of ‘known recent interest in a high-value product my company sells’ and ‘population of significant quantity.’ Leads and contacts with these two attributes are very good inputs to your content strategy.” (Scearce)

4. Remember that content is king.

“An important area to address related to nurturing is content. A first step would be to take an inventory of what content you have. You likely have more useful content than you realize. This includes whitepapers, videos, blog posts, articles, use cases, guest blogs, etc. Categorize them by the consumer they were designed for (technical sponsor vs. business sponsor, industry, and stage of interest/purchase). You can reuse content in very effective ways by breaking them into series pieces. Also, less is more with delivering content. People will get lost in long emails; better to present content in list form, such as Top 10 lists, 3 Things Your Competitors Are Doing, etc. Videos are easy, and people like watching short 90-second videos with success stories, use cases, industry trends.” (Vanella)

5. Act locally, think globally. 

“For those with worldwide businesses, I would create nurture programs for various regions, such as Europe, so that you are localizing the communications.” (Eisenberg)

6. Don’t invest in software right off the bat.

“When setting up a lead nurturing program, don’t buy software. It’s too early for that. You don’t know what you need; you may not require every bell and whistle in the foreseeable future. Familiarize yourself with what is possible given the budget and expertise available to you. When you know your audience and have a plan to engage them, you have a good sense of what you need your software to do. This is important because there’s a wee bit of an arms race going on in the marketing automation space. The leading and emerging vendors in this category are competing for a fast-growing but still-nascent market.” (Scearce)

In closing 

There’s no time like the present to stop losing valuable prospects and start grooming them into qualified leads. Setting up a lead-nurturing program doesn’t have to be a daunting task. The most important takeaway here is that you get in the game now, and take one step at a time. Identify your “nurture-able” leads, start with a single campaign, add in some content marketing, and worry about automating (and purchasing the needed software) down the road.

Have any tips for creating top-notch lead-nurturing campaigns? Let us know in the comments below!


Act-On Blog

Setting up Azure Disk Encryption for a Virtual Machine with PowerShell

As I discussed in my previous blog post, I opted to use Azure Disk Encryption for my virtual machines in Azure, rather than Storage Service Encryption. Azure Disk Encryption utilizes BitLocker inside of the VM. Enabling Azure Disk Encryption involves these Azure services:

  • Azure Active Directory for a service principal
  • Azure Key Vault for a KEK (key encryption key), which wraps around the BEK (BitLocker encryption key)
  • Azure Virtual Machine (IaaS)

Following are four scripts which configure encryption for an existing VM. I initially had it all as one single script, but I purposely separated them. Now that they are modular, if you already have a service principal and/or a key vault, you can skip those steps. I have my ‘real’ version of these scripts stored in an ARM Visual Studio project (same logic, just with actual names for the Azure services). These PowerShell templates go along with other ARM templates to serve as source control for our Azure infrastructure.

As any expert will immediately know by looking at my scripts below, I’m pretty much a PowerShell novice. So, be kind, dear reader. My purpose is to document the steps and the flow, add some commentary, and pull together a couple of pieces I found on different documentation pages.

Step 1: Set up Service Principal in AAD

<#
.SYNOPSIS
Creates Service Principal in Azure Active Directory
.DESCRIPTION
This script creates a service principal in Azure Active Directory. 
A service principal is required to enable disk encryption for VM.
.NOTES
File Name: CreateAADSvcPrinForDiskEncryption.ps1
Author : Melissa Coates
Notes: Be sure the variables in the input area are completed, following all standard naming conventions.
The $aadSvcPrinAppPassword needs to be removed before saving this script in source control.
.LINK
Supporting information: 

https://blogs.msdn.microsoft.com/azuresecurity/2015/11/16/explore-azure-disk-encryption-with-azure-powershell/


https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption

#>

#-----------------------------------------

#Input Area
$subscriptionName = 'MyAzureSubscriptionDev'
$aadSvcPrinAppDisplayName = 'VMEncryptionSvcPrinDev'
$aadSvcPrinAppHomePage = 'http://FakeURLBecauseItsNotReallyNeededForThisPurpose'
$aadSvcPrinAppIdentifierUri = 'https://DomainName.com/VMEncryptionSvcPrinDev'
$aadSvcPrinAppPassword = 'SuperStrongPassword'

#-----------------------------------------

#Manual login into Azure
Login-AzureRmAccount -SubscriptionName $subscriptionName

#-----------------------------------------

#Create Service Principal App to Use For Encryption of VMs
$aadSvcPrinApplication = New-AzureRmADApplication -DisplayName $aadSvcPrinAppDisplayName -HomePage $aadSvcPrinAppHomePage -IdentifierUris $aadSvcPrinAppIdentifierUri -Password $aadSvcPrinAppPassword
New-AzureRmADServicePrincipal -ApplicationId $aadSvcPrinApplication.ApplicationId

Step 2: Create Azure Key Vault

<#
.SYNOPSIS
Creates Azure Key Vault.
.DESCRIPTION
This script does the following:
1 - Creates a key vault in Azure.
2 - Allows the Azure Backup Service permission to the key vault.
This is required if Recovery Vault will be used for backups.
A key vault is required to enable disk encryption for VM.
.NOTES
File Name: ProvisionAzureKeyVault.ps1
Author : Melissa Coates
Notes: Be sure the variables in the input area are completed, following all standard naming conventions.
The key vault must reside in the same region as the VM which will be encrypted.
A Premium key vault is being provisioned so that an HSM key can be created for the KEK.
The 262044b1-e2ce-469f-a196-69ab7ada62d3 ID refers to the Azure Key Vault (which is why it is not a variable).
.LINK
Supporting information: 

https://blogs.msdn.microsoft.com/azuresecurity/2015/11/16/explore-azure-disk-encryption-with-azure-powershell/


https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption

#>

#-----------------------------------------

#Input Area
$subscriptionName = 'MyAzureSubscriptionDev'
$resourceGroupName = 'MyDevRG'
$keyVaultName = 'KeyVault-Dev'
$keyVaultLocation = 'East US 2'

#-----------------------------------------

#Manual login into Azure
#Login-AzureRmAccount -SubscriptionName $subscriptionName

#-----------------------------------------

#Create Azure Key Vault
New-AzureRmKeyVault -VaultName $keyVaultName -ResourceGroupName $resourceGroupName -Location $keyVaultLocation -Sku 'Premium'

#-----------------------------------------

#Permit the Azure Backup service to access the key vault
Set-AzureRmKeyVaultAccessPolicy -VaultName $keyVaultName -ResourceGroupName $resourceGroupName -PermissionsToKeys backup,get,list -PermissionsToSecrets get,list -ServicePrincipalName 262044b1-e2ce-469f-a196-69ab7ada62d3

Step 3: Connect Service Principal with Key Vault

<#
.SYNOPSIS
Enables the service principal for VM disk encryption to communicate with Key Vault.
.DESCRIPTION
This script does the following:
A - Allows service principal the selective permissions to the key vault so
that disk encryption functionality works.
B - Creates a KEK (Key Encryption Key). For Disk Encryption, a KEK is required 
in addition to the BEK (BitLocker Encryption Key).
Prerequisite 1: Service Principal name (see CreateAADSvcPrinForVMEncryption.ps1)
Prerequisite 2: Azure Key Vault (see ProvisionAzureKeyVault.ps1)
.NOTES
File Name: EnableSvcPrinWithKeyVaultForDiskEncryption.ps1
Author : Melissa Coates
Notes: Be sure the variables in the input area are completed, following all standard naming conventions.
The key vault must reside in the same region as the VM being encrypted.
The key type can be either HSM or Software (HSM offers additional security but does require a Premium key vault). 
.LINK
Supporting information: 

https://blogs.msdn.microsoft.com/azuresecurity/2015/11/16/explore-azure-disk-encryption-with-azure-powershell/


https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption

#>

#Input Area
$subscriptionName = 'MyAzureSubscriptionDev'
$resourceGroupName = 'MyDevRG'
$aadSvcPrinAppDisplayName = 'VMEncryptionSvcPrinDev'
$keyVaultName = 'KeyVault-Dev'
$keyName = 'VMEncryption-KEK'
$keyType = 'HSM'

#-----------------------------------------

#Manual login into Azure
#Login-AzureRmAccount -SubscriptionName $subscriptionName

#-----------------------------------------

#Allow the Service Principal Permissions to the Key Vault
$aadSvcPrinApplication = Get-AzureRmADApplication -DisplayName $aadSvcPrinAppDisplayName
Set-AzureRmKeyVaultAccessPolicy -VaultName $keyVaultName -ServicePrincipalName $aadSvcPrinApplication.ApplicationId -PermissionsToKeys 'WrapKey' -PermissionsToSecrets 'Set' -ResourceGroupName $resourceGroupName

#-----------------------------------------

#Create KEK in the Key Vault
Add-AzureKeyVaultKey -VaultName $keyVaultName -Name $keyName -Destination $keyType

#-----------------------------------------

#Allow Azure platform access to the KEK
Set-AzureRmKeyVaultAccessPolicy -VaultName $keyVaultName -ResourceGroupName $resourceGroupName -EnabledForDiskEncryption

Step 4: Enable Disk Encryption

<#
.SYNOPSIS
Enables disk encryption for a VM.
.DESCRIPTION
This script enables disk encryption for an Azure virtual machine.
Prerequisite 1: Service Principal name (see CreateAADSvcPrinForDiskEncryption.ps1)
Prerequisite 2: Azure Key Vault (see ProvisionAzureKeyVault.ps1)
Prerequisite 3: Permissions to Key Vault for Service Principal (see EnableSvcPrinWithKeyVaultForDiskEncryption.ps1)
.NOTES
File Name: EnableAzureDiskEncryption.ps1
Author : Melissa Coates
Notes: Be sure the variables in the input area are completed, following all standard naming conventions.
Azure Disk Encryption (ADE) requires that the Azure VM already exists and is running.
To verify when completed: Get-AzureRmVmDiskEncryptionStatus -ResourceGroupName $resourceGroupName -VMName $vmName
.LINK
Supporting information: 

https://blogs.msdn.microsoft.com/azuresecurity/2015/11/16/explore-azure-disk-encryption-with-azure-powershell/


https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption

#>

#-----------------------------------------

#Input Area
$subscriptionName = 'MyAzureSubscriptionDev'
$resourceGroupName = 'MyDevRG'
$keyVaultName = 'KeyVault-Dev'
$keyName = 'VMEncryption-KEK'
$vmName = 'VMName-Dev'
$aadSvcPrinAppDisplayName = 'VMEncryptionSvcPrinDev'   #same display name used in Step 1
$aadSvcPrinAppPassword = 'SuperStrongPassword'         #same password used in Step 1; remove before saving to source control

#-----------------------------------------

#Manual login into Azure
#Login-AzureRmAccount -SubscriptionName $subscriptionName

#-----------------------------------------

#Enable Encryption on Virtual Machine
#Look up the service principal created in Step 1 (needed for the AadClientID parameter)
$aadSvcPrinApplication = Get-AzureRmADApplication -DisplayName $aadSvcPrinAppDisplayName
$keyVault = Get-AzureRmKeyVault -VaultName $keyVaultName -ResourceGroupName $resourceGroupName
$diskEncryptionKeyVaultUrl = $keyVault.VaultUri
$keyVaultResourceId = $keyVault.ResourceId
$keyEncryptionKeyUri = Get-AzureKeyVaultKey -VaultName $keyVaultName -KeyName $keyName
Set-AzureRmVMDiskEncryptionExtension -ResourceGroupName $resourceGroupName -VMName $vmName -AadClientID $aadSvcPrinApplication.ApplicationId -AadClientSecret $aadSvcPrinAppPassword -DiskEncryptionKeyVaultUrl $diskEncryptionKeyVaultUrl -DiskEncryptionKeyVaultId $keyVaultResourceId -KeyEncryptionKeyUrl $keyEncryptionKeyUri.Id -KeyEncryptionKeyVaultId $keyVaultResourceId
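Once the extension finishes provisioning (it can take a while), a quick sanity check is the verification cmdlet already mentioned in the .NOTES above; the OS volume (and data volumes, if any) should report back as encrypted:

#Verify encryption status after the extension has finished provisioning
Get-AzureRmVMDiskEncryptionStatus -ResourceGroupName $resourceGroupName -VMName $vmName
#Expect OsVolumeEncrypted (and DataVolumesEncrypted, if applicable) to show Encrypted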


Blog – SQL Chick

Setting your Default Address List for Microsoft Dynamics CRM

In Microsoft Dynamics CRM, your contacts are moved by default into your Contacts address list in Outlook. However, Outlook defaults to your Global Address List, which means you do not automatically see your CRM contacts in Outlook. In this article, we will assist you with updating your default address list.

To set your default address list, click the Address Book button on the Outlook ribbon.


Once you have opened the address book, click Tools.


Then choose Options.


Once you are in Options, take the following steps:

1. Start with contact folders
2. Choose Contacts in the “When opening” dropdown
3. Click OK


You now have the default list set to see your CRM contacts.  For more tips, sign up for our email newsletter by visiting http://www.toplineresults.com/contact-us/.


CRM Software Blog

Why 2017 is setting up to be the year of GPU chips in deep learning


It’s been a while since anyone in the tech world really cared about hardware, but that could change in 2017, thanks to the ascendance of deep learning.

According to Forrester analyst Mike Gualtieri, we may be entering a golden age of hardware driven largely by graphics processing units, or GPUs. This alternative to traditional CPU technology is optimized to process tasks in parallel, rather than sequentially. This makes GPU chips a good fit for training deep learning models, which involves crunching enormous volumes of data again and again until the models can recognize images, parse natural speech or recommend products to online shoppers.

“There will be a hardware renaissance because of the compute needs to train these models,” Gualtieri said.

Need grows for GPU tools

GPU technology has been around for decades, but only recently has it gained traction among enterprises. It was traditionally used to enhance computer graphics, as the name suggests. But as deep learning and artificial intelligence have grown in prominence, the need for fast, parallel computation to train models has increased.

“A couple years ago, we wouldn’t be looking at special hardware for this,” said Adrian Bowles, founder of analyst firm STORM Insights Inc., based in Boston. “But with [deep learning], you have a lot of parallel activities going on, and GPU-based tools are going to give you more cores.”

He said he expects the market for GPU chips to heat up in the year ahead. NVIDIA, one of the first companies to start marketing GPU chips for analytics, teamed up with IBM and Microsoft this year to put its GPU chips in server technology aimed at cognitive applications. The biggest player in the processing market, Intel, may also look to get deeper into GPU technology.

GPUs mean changes for developers

Bowles said the growing popularity of this hardware, in some cases, may mean developers will have to learn new ways of working. Since the manner in which a GPU processes data is so different from a CPU, applications will need to be built differently to take full advantage of the benefits. This means developers will need more training to keep their skills sharp going forward.

Similarly, more enterprises will start to build their data architectures around GPU technology, said Tom Davenport, founder of the International Institute for Analytics, based in Portland, Ore., in a webcast. He said more businesses are looking at ways to implement image-recognition technology, which is built around deep learning models. Larger companies, like Google, Uber and Tesla, are also pushing forward with autonomous vehicles, which lean heavily on learning models. The growing role for these types of tasks will increase the demand for GPUs.

“I think we’re going to start seeing more computing architectures that focus specifically on GPUs,” Davenport said. “That’s been a part of the success of these deep learning models for image recognition and other applications.”

GPU technology could also enhance the trend of the citizen data scientist, said Paul Pilotte, technical marketing manager at MathWorks Inc., based in Natick, Mass. He pointed out that Amazon this year introduced GPU chips to its Elastic Compute Cloud platform. The fact it’s all hosted means users don’t need to be hardware experts. This lowers the bar to entry and makes the technology available to a wider user base.

“The workflows for deep learning, prescriptive analytics and big data will become more accessible,” Pilotte said.



SearchBusinessAnalytics: BI, CPM and analytics news, tips and resources

Setting a new pdf Distribution and estimating its parameters by MLE


Good morning. I need to create a PDF and estimate its parameters by the MLE method. The distribution is a power law ($f(x)=C(k)\,x^{-k}$) and I need to estimate the $k$ parameter; $C(k)$ is determined by the normalization condition $\int_a^b C(k)\,x^{-k}\,dx=1$. I have tried to define the power-law distribution with the ProbabilityDistribution function using, for example, C=5, but when I want to plot it, Mathematica doesn’t display the plot. What I’ve been writing is

distr=ProbabilityDistribution[5/x^1.05,{x,0,Infinity}]
PDF[distr,x]
Plot[%,{x,0,1}]

NOTE: when I try to define the distribution function I’m setting $k=1.05$, just to check whether I’m doing it correctly.

Could you help me, please?

Thanks for your answers.


Recent Questions – Mathematica Stack Exchange

Setting Up a PC for Azure Cortana Intelligence Suite Development

Some development aspects of the Cortana Intelligence Suite can occur in the Azure Portal. There are also some additional client tools which are helpful, or potentially required, to fully create solutions. This is a quick checklist of tools you probably want to install on a development machine for purposes of working on the analytics, BI, and/or data warehousing elements of Azure.

1. SQL Server Management Studio (SSMS)

The latest version of SSMS is recommended for compatibility with Azure services, as well as backward compatibility with all SQL Server versions back to 2008.

Download SSMS:  https://msdn.microsoft.com/en-us/library/mt238290.aspx

2. Visual Studio 2015

The latest 2015 version of Visual Studio is recommended for full functionality for the newest components. If you choose to do a customized VS installation, be sure to select the option to install Microsoft Web Developer Tools. (If you don’t, when you try to install the Azure SDK later it won’t install properly because prerequisites are missing. Yeah, yeah, I’ve been there.)

If you don’t have a license available, look into using the VS Community edition.

Download VS 2015: https://www.visualstudio.com/downloads/download-visual-studio-vs 

Note: in addition to “Visual Studio 2015,” there’s also a “Visual Studio 15 Preview.” The 15 Preview is *not* the same thing as Visual Studio 2015, even though 15 is in its name. So, just watch out for that in terms of naming.


3. SQL Server Data Tools (SSDT) for Visual Studio 2015

Here is where you gain the ability to create BI projects: SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), and SQL Server Reporting Services (SSRS). These SQL Server BI projects aren’t considered part of Cortana Intelligence Suite, but if you’re creating an analytics, BI, and/or data warehousing solution you may need at least one of these types of BI projects as part of the overall solution.

With the latest version of SSDT for VS 2015, you’ll also be able to interact with all versions of SQL Server, as well as Azure SQL Database and Azure SQL Data Warehouse.

Example of what an Azure SQL Data Warehouse cloud resource looks like from within Visual Studio (SSDT):


4. Azure SDK

The Azure SDK sets up lots of libraries; the main features we are looking for from the Azure SDK right away are (a) the ability to use the Cloud Explorer within Visual Studio, and (b) the ability to create ARM template projects for automated deployment purposes. In addition to the Server Explorer we get from Visual Studio, the Cloud Explorer from the SDK gives us another way to interact with our resources in Azure. 

Example of what the Cloud Explorer pane looks like in Visual Studio (by Resource Group, and by Resource Type):


Example of what you’ll see related to Azure Resource Groups after the Azure SDK is installed:


Example of various QuickStart project types available (none of these are directly related to Cortana Intelligence Suite, but might factor into your overall solution):


5.  Relevant Visual Studio Extensions

These are important extensions for working with Cortana Intelligence Suite at this time:
-Microsoft Azure Data Lake Tools for Visual Studio 2015 <-- Automatically installed as part of the Azure SDK
-Microsoft Azure HDInsight Tools for Visual Studio 2015 <-- Automatically installed as part of the Azure SDK
-Microsoft Azure Data Factory Tools for Visual Studio 2015

At the time of this writing (June 2016), Azure Data Factory Tools are not automatically installed with the Azure SDK. That will probably change at some point, I would guess.

Example of the Cortana Intelligence Suite projects you’ll see after the Azure extensions are installed (a U-SQL project is associated with the Azure Data Lake Analytics service):


6.  Microsoft Azure Storage Explorer and/or AzCopy

I really like the new standalone Azure Storage Explorer for uploading and downloading files to Azure Storage. AzCopy is another alternative – AzCopy is a command line utility instead of a graphical UI, so it’s better suited for automation and scripting purposes. 
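For reference, here’s a minimal sketch of an AzCopy upload using the classic command-line syntax of this era; the local folder, storage account, container, and key are all placeholders:

REM Recursively upload all CSV files from a local folder to a blob container (placeholders throughout)
AzCopy /Source:C:\DataFiles /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /DestKey:YourStorageAccountKey /Pattern:*.csv /S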

Example of what Azure Storage Explorer looks like:



Blog – SQL Chick

Tech Tip Thursday: setting up a Power BI test tenant


Microsoft’s Guy in a Cube has been providing tips and tricks for Power BI and Business Intelligence on his YouTube channel since 2014. Every Thursday we highlight a different helpful video from his collection.

In this week’s Tech Tip Thursday, Guy in a Cube shows us the three steps needed to set up a managed trial Office 365/Azure Active Directory tenant as a test environment. Learn how to test Power BI administration features without having to own your own domain!


Microsoft Power BI Blog | Microsoft Power BI

Setting the Record Straight on Mainframe Security and Compliance

Last November, Syncsort announced the results of an industry survey that confirmed the mainframe’s role at center stage in large corporations around the world. The findings highlighted how the mainframe has evolved to meet the growing need to process and analyze large volumes of Big Data. The survey also uncovered several other key findings about the mainframe’s use in large businesses, where it still houses the lion’s share of corporate data, serves as the system of record, and is now at the heart of Big Data strategies.

Data from the survey also shed light on why mainframe vendors cannot be complacent about mainframe security.

In a recently published column in Enterprise Executive, “Setting the Record Straight on Mainframe Security and Compliance,” Harvey Tessler, a Syncsort founder with decades of product and market knowledge in the IBM mainframe space, leveraged survey data to clarify some of the questions about how secure, reliable, and compliant the mainframe is, given endless security threats and cyber-attacks from within and outside businesses, as well as escalating regulatory requirements.

An excerpt of Harvey’s column follows below:

Not surprisingly, in a recent “State of the Mainframe” study, polling 187 respondents including IT Strategic Planners, Directors/Managers, Architects and Administrators at Global Enterprises with $1B or more in annual revenues, Syncsort found that 82.9 and 83.4 percent of respondents cited security and availability as key strengths of the mainframe. Even fortresses can be vulnerable if repeatedly attacked — mainframes are not impregnable.

Please rank security of critical enterprise data as a key strength of the mainframe at your company on a scale of 1 being very important to 5 not important.


The recent “State of the Mainframe” study reflected confidence in mainframe security

This isn’t a mainframe-only issue — IT departments need to ensure that their organizations have the security measures in place to protect business data in an increasingly decentralized, Big Iron to Big Data world, with higher and higher volumes of diverse data types coming from multiple data sources, including mainframes, data warehouses, open systems, mobile devices and the IoT. They also have to comply with a number of industry and federal regulations (PCI DSS, FISMA, Sarbanes-Oxley, HIPAA and HITECH) designed to keep sensitive customer information safe.

Organizations are also looking to leverage advanced analytics on Big Data and analytics platforms. In the “State of the Mainframe” study, when asked to rank their use of the mainframe for transferring Big Iron data to Big Data and analytics platforms, respondents showed a high level of interest (see Figure 2).

In fact, in recent commentary on the “State of the Mainframe” study, Denny Yost, associate publisher and editor-in-chief of Enterprise Systems Media, said, “The survey demonstrates that many big companies are using mainframe as the back-end, transaction hub for their Big Data strategies, grappling with the same data cost and management challenges they used to tackle before, but applying it to more complex use cases with more dauntingly large and diverse amounts of data.”

Please rank your use of the mainframe for transferring data to big data and analytics platforms on a scale of 1 being very important to 5 not important.


The “State of the Mainframe” study also showed 72 percent of respondents are interested or very interested in Big Iron to Big Data Strategies, with more than half of them choosing the top 2 rankings for importance

Rising to the challenge, IBM continues to reinvent the mainframe for modern computing needs. The z13 model, for example, announced in January 2015, can crunch over two billion transactions per day, and it “eats mobile application data for lunch.” In the z13, IBM also focused on directly integrating security measures, including embedded cryptography. Big Blue says the new entry-level z13 offers encryption at twice the speed of previous mid-range systems, without compromising performance. Still, of course, there is more to be done to fortify mainframe security and compliance, and software providers need to ante up.

And it’s not just about mainframe anymore.

Recent customer use cases indicate that Big Iron’s evolving future will center on processing and analyzing exploding data volumes, and working together with legacy and new computing platforms and data delivery technologies, all to support data-driven decision-making for the business. This trend reflects the mainframe’s ability to be that hub for emerging Big Data analytics platforms, mobile applications and the IoT.

If you would like to read Harvey’s column in its entirety, click here.  For more information on the State of the Mainframe study, click here.


Syncsort blog