Tag Archives: Create

Hello, I want to create a matrix with a certain number of rows and columns

December 6, 2020   BI News and Info


Recent Questions – Mathematica Stack Exchange


How to Create Microservices-based Applications for AWS

October 28, 2020   TIBCO Spotfire

Reading Time: 2 minutes

Market demands are shifting rapidly, with many disruptive forces in motion. Businesses are reacting in a number of different ways to preserve cash, change the way they operate, and accelerate digital business initiatives to capture new value. Today’s disruptions are planting seeds for broad and permanent changes across all markets, so businesses need to act now in order to prepare for what’s to come in the near future. To combat these forces, a business needs to be agile so that it can rapidly adapt its operations, as well as its products and services, to meet the new market conditions. In any case, the business that is able to react quickly maintains resiliency and has a foundation for rapid growth and innovation.

A key starting point for increasing business agility is the digital platform, as businesses are operating more with digital services than manual, rigid, paper-based processes. If you aren’t able to rapidly adapt the services and capabilities of your digital platform to stay aligned with the needs of the business, then your underlying application architecture needs to be evolved so that it becomes more agile. One way to build this agility is by evolving to a microservices architecture.

Microservices are very small units of executable code. The industry has long preached the benefits of breaking down large, monolithic applications into smaller units of execution, and technology has evolved in recent years to the point where this strategy produces high-performing apps. Microservices can be used to break up monoliths into individual, highly cohesive business services that are deployed in containers and serverless environments. Thus, microservices can each be adapted, deployed, and scaled independently of other microservices. This gives the business a high degree of flexibility to adapt the digital platform very quickly.

TIBCO Cloud Integration makes it easy to develop and deploy your business logic in event-driven microservices and functions to AWS.  You can use pre-packaged connectors for AWS to connect to a wide variety of Amazon services to create application logic. The entire application architecture is highly efficient and cost-effective which will accelerate your adoption of AWS technologies.

TIBCO Cloud Integration simplifies the development and deployment of event-driven applications built with microservices and functions to AWS. Once apps are created, you can package your microservices into a Docker image and deploy them to the AWS container management service of your choice, including Amazon EKS, ECS, and Fargate, or to other container management services. They can also be deployed seamlessly to AWS Lambda.

TIBCO’s extensive experience in intelligent connectivity, combined with AWS’s highly flexible and scalable cloud platform, makes for a natural partnership. TIBCO is an AWS Advanced Technology Partner, and we partner with AWS in both technology and business development initiatives. We have many solutions that run natively on AWS and that are also available for purchase through the AWS Marketplace, not only for connectivity but also for analytics, machine learning, and data management.


To learn more about how to create microservices-based applications for AWS, watch this webinar hosted by BrightTalk. And to learn more about TIBCO Cloud Integration, watch our demos or sign up for a 30-day free trial.


The TIBCO Blog


Trying to create a list that counts the number primes for each remainder class

September 27, 2020   BI News and Info


Recent Questions – Mathematica Stack Exchange


Researchers create dataset to advance U.S. Supreme Court gender bias analysis

September 22, 2020   Big Data


University of Washington language researchers and legal professionals recently created a labeled dataset for detection of interruptions and competitive turn-taking in U.S. Supreme Court oral arguments. They then used the corpus of “turn changes” to train AI models to experiment with ways to automatically classify turn changes as competitive or cooperative as a way to analyze gender bias.

“In-depth studies of gender bias and inequality are critical to the oversight of an institution as influential as the Supreme Court,” reads the paper University of Washington researchers Haley Lepp and Gina-Anne Levow published on preprint repository arXiv one week ago. “We find that as the first person in an exchange, female speakers and attorneys are spoken to more competitively than are male speakers and justices. We also find that female speakers and attorneys speak more cooperatively as the second person in an exchange than do male speakers and justices.”

Attorneys who speak before the Supreme Court are allotted 30 minutes of oral argument and are expected to stop talking when a justice speaks. Linguists have observed men interrupting women routinely in professional environments and other settings.

Turn changes are defined as instances when one person stops speaking and another person starts speaking. Short audio clips of each turn change were annotated as competitive or cooperative by 77 members of the U.S. legal community who identify as an attorney, judge, legal scholar, or law student in their second year or higher. Lepp and Levow’s work focuses on measuring whether the turn change was cooperative or competitive, based on oral argument audio the Supreme Court made available, in part because previous work by Deborah Tannen found that interruptions in speech can be part of regular discourse and that the context of the conversation can be a factor.

The paper devoted to gender bias analysis was published days before the death of Supreme Court Justice Ruth Bader Ginsburg at the age of 87. Ginsburg was the second woman ever appointed to the U.S. Supreme Court. As a litigator for the American Civil Liberties Union (ACLU), Ginsburg successfully argued cases before the Supreme Court that greatly extended women’s rights in the United States. On Wednesday and Thursday, she will be the first woman and the first Jewish person in U.S. history to lie in state at the U.S. Capitol building for members of the public to say goodbye. She was the longest-serving female justice in U.S. history.

Although voting has already begun in some parts of the country and Ginsburg pleaded in her final days to let the winner of the presidential election fill her vacancy, President Trump is expected to nominate a pick to fill her seat Friday or Saturday. Two Republican Senators pledged not to vote until the presidential election is decided, but Senate Majority Leader Mitch McConnell said just hours after her death that the president’s nominee will get a vote.

Details of the turn changes corpus dataset follow a 2017 study that used automation to identify the number of interruptions that occurred from 2004-2015. The study “Justice, Interrupted: The Effect of Gender, Ideology and Seniority at Supreme Court Oral Arguments” by Tonja Jacobi and Dylan Schweers found that women are interrupted three times as often as male Supreme Court justices are. Female Supreme Court justices were interrupted by attorneys as well as other Supreme Court justices, led by Anthony Kennedy, Antonin Scalia, and William Rehnquist. Scalia and Stephen Breyer also interrupted each other a lot.

A producer of the podcast More Perfect noticed people repeatedly interrupting Ginsburg, which led to an episode on the subject. Jacobi spoke on the podcast and said Ginsburg developed tactics to adapt to frequent interruptions, first by asking to ask a question, then pivoting to ask questions more like male justices who interrupt.

The episode also highlighted that Justice Sonia Sotomayor was found to speak as often as men in the Jacobi study, but has still drawn criticism from media commentators at times for being aggressive. Gendered framing is pervasive in coverage of supreme courts, according to a 2016 analysis of media coverage in five democratic countries. The analysis found that, generally, women who ask questions like male justices do are labeled abrasive, militant, or mean by critics.

Last year, the U.S. Supreme Court introduced a rule that justices will try to give attorneys two minutes to speak without interruption at the start of oral arguments.


Big Data – VentureBeat


Intel researchers create AI system that rates similarity of 2 pieces of code

July 29, 2020   Big Data


In partnership with researchers at MIT and the Georgia Institute of Technology, Intel scientists say they’ve developed an automated engine — Machine Inferred Code Similarity (MISIM) — that can determine when two pieces of code perform similar tasks, even when they use different structures and algorithms. MISIM ostensibly outperforms current state-of-the-art systems by up to 40 times, showing promise for applications from code recommendation to automated bug fixing.

With the rise of heterogeneous computing — i.e., systems that use more than one kind of processor — software platforms are becoming increasingly complex. Machine programming (a term coined by Intel Labs and MIT) aims to tackle this with automated, AI-driven tools. A key technology is code similarity, or systems that attempt to determine whether two code snippets show similar characteristics or achieve similar goals. Yet building accurate code similarity systems is a relatively unsolved problem.

MISIM works because of its novel context-aware semantic structure (CASS), which susses out the purpose of a given bit of source code using AI and machine learning algorithms. Once the structure of the code is integrated with CASS, algorithms assign similarity scores based on the jobs the code is designed to perform. If two pieces of code look different but perform the same function, the models rate them as similar — and vice versa.

CASS can be configured to a specific context, enabling it to capture information that describes the code at a higher level. And it can rate code without using a compiler, a program that translates human-readable source code into computer-executable machine code. This confers the usability advantage of allowing developers to execute on incomplete snippets of code, according to Intel.

Intel says it’s expanding MISIM’s feature set and moving it from the research to the demonstration phase, with the goal of creating a code recommendation engine to assist internal and external researchers programming across its architectures. The proposed system would be able to recognize the intent behind an algorithm and offer candidate codes that are semantically similar but with improved performance.

That could save employers a few headaches — not to mention helping developers themselves. According to a study published by the University of Cambridge’s Judge Business School, programmers spend 50.1% of their work time not programming and half of their programming time debugging. And the total estimated cost of debugging is $312 billion per year. AI-powered code suggestion and review tools like MISIM promise to cut development costs substantially while enabling coders to focus on more creative, less repetitive tasks.

“If we’re successful with machine programming, one of the end goals is to enable the global population to be able to create software,” Justin Gottschlich, Intel Labs principal scientist and director of machine programming research, told VentureBeat in a previous interview. “One of the key things you want to do is enable people to simply specify the intention of what they’re trying to express or trying to construct. Once the intention is understood, with machine programming, the machine will handle the creation of the software — the actual programming.”


Big Data – VentureBeat


RetrieveGAN AI tool combines scene fragments to create new images

July 22, 2020   Big Data


Researchers at Google, the University of California, Merced, and Yonsei University developed an AI system — RetrieveGAN — that takes scene descriptions and learns to select compatible patches from other images to create entirely new images. They claim it could be beneficial for certain kinds of media and image editing, particularly in domains where artists combine two or more images to capture the most appealing elements of each.

AI and machine learning hold incredible promise for image editing, if emerging research is any indication. Engineers at Nvidia recently demoed a system — GauGAN — that creates convincingly lifelike landscape photos from whole cloth. Microsoft scientists proposed a framework capable of producing images and storyboards from natural language captions. And last June, the MIT-IBM Watson AI Lab launched a tool — GAN Paint Studio — that lets users upload images and edit the appearance of pictured buildings, flora, and fixtures.

By contrast, RetrieveGAN captures the relationships among objects in existing images and leverages this to create synthetic (but convincing) scenescapes. Given a scene graph description — a description of objects in a scene and their relationships — it encodes the graph in a computationally-friendly way, looks for aesthetically similar patches from other images, and grafts one or more of the patches onto the original image.


The researchers trained and evaluated RetrieveGAN on images from the open source COCO-Stuff and Visual Genome data sets. In experiments, they found that it was “significantly” better at isolating and extracting objects from scenes on at least one benchmark compared with several baseline systems. In a subsequent user study where volunteers were given two sets of patches selected by RetrieveGAN and other models and asked the question “Which set of patches are more mutually compatible and more likely to coexist in the same image?,” the researchers report that RetrieveGAN’s patches came out on top the majority of the time.

“In this work, we present a differentiable retrieval module to aid the image synthesis from the scene description. Through the iterative process, the retrieval module selects mutually compatible patches as reference for the generation. Moreover, the differentiable property enables the module to learn a better embedding function jointly with the image generation process,” the researchers wrote. “The proposed approach points out a new research direction in the content creation field. As the retrieval module is differentiable, it can be trained with the generation or manipulation models to learn to select real reference patches that improves the quality.”

Although the researchers don’t mention it, there’s a real possibility their tool could be used to create deepfakes, or synthetic media in which a person in an existing image is replaced with someone else’s likeness. Fortunately, a number of companies have published corpora in the hopes the research community will pioneer detection methods. Facebook — along with Amazon Web Services (AWS), the Partnership on AI, and academics from a number of universities — is spearheading the Deepfake Detection Challenge. In September 2019, Google released a collection of visual deepfakes as part of the FaceForensics benchmark, which was cocreated by the Technical University of Munich and the University Federico II of Naples. More recently, researchers from SenseTime partnered with Nanyang Technological University in Singapore to design DeeperForensics-1.0, a data set for face forgery detection that they claim is the largest of its kind.


Big Data – VentureBeat


Researchers detail texture-swapping AI that could be used to create deepfakes

July 8, 2020   Big Data

In a preprint paper published on arXiv.org, researchers at the University of California, Berkeley and Adobe Research describe the Swapping Autoencoder, a machine learning model designed specifically for image manipulation. They claim it can modify any image in a variety of ways, including texture swapping, while remaining “substantially” more efficient compared with previous generative models.

The researchers acknowledge that their work could be used to create deepfakes, or synthetic media in which a person in an existing image or video is replaced with someone else’s likeness. In a human perceptual study, subjects were fooled 31% of the time by images created using the Swapping Autoencoder. But they also say that proposed detectors can successfully spot images manipulated by the tool at least 73.9% of the time, suggesting the Swapping Autoencoder is no more harmful than other AI-powered image manipulation tools.

“We show that our method based on an auto-encoder model has a number of advantages over prior work, in that it can accurately embed high-resolution images in real-time, into an embedding space that disentangles texture from structure, and generates realistic output images … Each code in the representation can be independently modified such that the resulting image both looks realistic and reflects the unmodified codes,” the coauthors of the study wrote.

The researchers’ approach isn’t novel in the sense that many AI models can edit portions of images to create new images. For example, the MIT-IBM Watson AI Lab released a tool that lets users upload photographs and customize the appearance of pictured buildings, flora, and fixtures, and Nvidia’s GauGAN can create lifelike landscape images that never existed. But these models tend to be challenging to design and computationally intensive to run.



By contrast, the Swapping Autoencoder is lightweight, using image swapping as a “pretext” task for learning an embedding space useful for image manipulation. It encodes a given image into two separate latent codes — a “structure” code and a “texture” code. During training, the structure code learns to correspond to the layout or structure of a scene, while the texture code captures properties of the scene’s overall appearance.

In an experiment, the researchers trained Swapping Autoencoder on a data set containing images of churches, animal faces, bedrooms, people, mountain ranges, and waterfalls and built a web app that offers fine-grained control over uploaded photos. The app supports global style editing and region editing as well as cloning, with a brush tool that replaces the structure code from another part of the image.

“Tools for creative expression are an important part of human culture … Learning-based content creation tools such as our method can be used to democratize content creation, allowing novice users to synthesize compelling images,” the coauthors wrote.


Big Data – VentureBeat


Create a Safe Place to Work with TIBCO GatherSmart

June 25, 2020   TIBCO Spotfire

Reading Time: 2 minutes

As the economy starts to reopen and organizations look to return to work, it’s essential that businesses do so responsibly and safely to prevent further spread of the COVID-19 virus. An important way to ensure a safe place to work is to give organizations immediate and ongoing visibility into employee health status and an easy way to manage the flow of employees back to the office.

That’s why TIBCO LABS™, our own innovation research group that is focused on applying emerging technologies to today’s business problems, has fast-tracked the deployment of a symptom tracking solution, TIBCO GatherSmart™. The GatherSmart solution is designed to help organizations monitor and regularly check on personnel’s health and readiness for coming into the office. It supports immediate visibility into employee status and enables the safe return of personnel to the office.

Leveraging the powerful TIBCO Connected Intelligence platform and the innovative, technical resources of TIBCO LABS, the GatherSmart™ solution provides a safe, easy-to-use tool for all types and sizes of organizations. 

You can watch a  4-minute video for a quick overview of the TIBCO GatherSmart™ solution.


Part of the #TIBCO4Good Initiative

GatherSmart is a part of our wider TIBCO4Good mission aimed at contributing our expertise and technology to help communities solve meaningful human challenges with a data-centric approach. GatherSmart grew alongside the amazing efforts of our data science team who began working on a series of dashboards to support communities managing the spread of COVID-19 with actionable insights and deep analytic capabilities back in March. They built a Visual Analytics Hub to help visualize the progress and effects of local interventions. Many groups have used this resource in the fight against COVID-19.

Designed to help employers manage the challenges that come with transitioning back to work, GatherSmart is:

  • Safe and simple to use 
  • Available across multiple mobile devices
  • Frictionless to install
  • Single source of truth to safely manage personnel access to work sites 
  • Not-for-profit solution

Here’s a breakdown of the different components of the solution:

  • Smart Mobile App: The employee mobile app provides a quick, simple way for organizations to assess the readiness of employees to return to the workplace. With a daily health survey form, predetermined by the employer and in compliance with HR policies and privacy standards, employees report their symptoms and receive guidance on whether to work from home or go into the office. If they are given a green light to return to the office, employees will receive a daily digital passport to be scanned on entrance.
  • Employer Control Center: HR and other personnel departments of the employer can use the TIBCO GatherSmart control center to manage access to different offices. Drawing from up-to-date survey data and other analytics, this single dashboard creates a comprehensive view of employees’ status with regional context and allows for quick and easy monitoring and management of personnel.  
[Image: The 3 different components of TIBCO GatherSmart]

In alignment with this mission, we are excited to announce that the GatherSmart solution will be free to companies with fewer than 50 employees and sold at cost to others. We know this affordable solution, in combination with other public safety and government initiatives, will help us all transition safely and responsibly back to work.


The TIBCO Blog


Is there any better way to create {a, a, a, b, b, b, b, b, c, c} from {{a, 3}, {b, 5}, {c, 2}}?

June 24, 2020   BI News and Info

My attempt:

data = {{a, 3}, {b, 5}, {c, 2}};

output = Table[#[[1]], #[[2]]] & /@ data // Flatten

{a, a, a, b, b, b, b, b, c, c}

1 Answer

Join @@ Table @@@ data
{a, a, a, b, b, b, b, b, c, c}

or

Join @@ ConstantArray @@@ data
{a, a, a, b, b, b, b, b, c, c}


Recent Questions – Mathematica Stack Exchange


How to Create an Ubuntu PowerShell Development Environment – Part 3

May 31, 2020   BI News and Info

The series so far:

  1. How to Create an Ubuntu PowerShell Development Environment – Part 1
  2. How to Create an Ubuntu PowerShell Development Environment – Part 2
  3. How to Create an Ubuntu PowerShell Development Environment – Part 3

Over the last few years, Microsoft has made great strides in making their software products available on a wider range of platforms beyond Windows. Many of their products will now run on a variety of Linux distributions (often referred to as “distros”), as well as Apple’s macOS platform. This includes their database product, SQL Server.

One way in which Microsoft achieved cross-platform compatibility is through containers. If you aren’t familiar with containers, you can think of them as a stripped-down virtual machine. Only the components necessary to run the application, in this case, SQL Server, are included. The leading tool to manage containers is called Docker. Docker is an application that will allow you to download, create, start and stop, and run containers. If you want a more detailed explanation of containers, please see the article What is a Container on Docker’s website.

Assumptions

For this article, you should understand the concepts of a container, although no experience is required. See the article from Docker referenced in the previous section if you desire more enlightenment on containers. Additionally, this article assumes you are familiar with the SQL language, as well as some basics of PowerShell. Note that throughout this article, when referencing PowerShell, it’s referring to the PowerShell Core product.

The Platform

The previous articles, How to Create an Ubuntu PowerShell Development Environment Part 1 and Part 2, walked through the steps of creating a virtual machine for Linux development and learning. That VM is the basis for this article. All the code demos in this article were created and run on that specific virtual machine. For best results, you should first follow the steps in that article to create a VM. From there, you will be in a good place to follow along with this article. However, the demos have also been tested on other variations of Ubuntu, on CentOS, and on macOS.

In those articles, I showed not just the creation of the virtual machine, but the steps to install PowerShell and Visual Studio Code (VSCode), tools you will need in order to complete the demos in this article should you wish to follow along.

For the demo, I am assuming you have downloaded the demo files and opened them in Visual Studio Code within the virtual machine, and are executing individual samples by highlighting the code sample and using the F8 key, or by right-clicking on the selected text and picking run.

The Demo

The code samples in this article are part of a bigger sample I provide on my GitHub site. You’ll find the entire project here. There is a zip file included that contains everything in one easy download, or you can look through GitHub and pick and choose the files you want. GitHub also displays Markdown correctly, so you may find it easier to view the project documentation via GitHub rather than in VSCode.

This article uses two specific files, located in the Demo folder: m11-cool-things-1-docker.ps1 and m11-install-docker.sh. While this article will extract the relevant pieces and explain them, you will find it helpful to review the entire script in order to understand the overall flow of the code.

The Beginning

The first thing the PowerShell script does is use the Set-Location cmdlet to set the current location to the folder where you extracted the demo code. This location should have the Demo, Notes, and Extras folders under it.

Next, make sure Docker is installed, and if not, install it. The command to do this is rather interesting.

bash ./Demo/m11-install-docker.sh

bash is very similar to PowerShell; it is both a shell and a scripting language. It is native to many Linux distros, including the Ubuntu-based ones. This code uses PowerShell to start a bash session and then executes the bash script m11-install-docker.sh. When the script finishes executing, the bash session ends.

Take a look inside that bash script.

if [ -x "$(command -v docker)" ]; then
    echo "Docker is already installed"
else
    echo "Installing Docker"
    sudo snap install docker
fi

The first line attempts to run a command that will complete successfully if Docker is installed. If so, it simply displays that information to the screen via the echo command.

If Docker is not installed, then the script will attempt to install Docker using the snap utility. Snap is a package manager introduced in the Ubuntu line of distros; other distros use a format known as Flatpak. On macOS, brew is the package manager of choice. This is one part of the demo you may need to alter depending on your distro. See the documentation for your specific Linux install for more details.

Of course, there are other ways to install Docker. The point of these few lines was to demonstrate how easy it is to run bash scripts from inside your PowerShell script.
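For example, on Debian- or Ubuntu-based distros that do not use snap, the docker.io package in the standard repositories is one alternative, and on macOS Homebrew can install Docker Desktop. The commands below are only a rough sketch of those alternatives and are not part of the demo script:

# Debian/Ubuntu without snap (package name can vary by distro)
sudo apt-get update && sudo apt-get install -y docker.io

# macOS with Homebrew
brew install --cask docker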

Pulling Your Image

A Docker image is like an ISO. Just as you would use an ISO image to create a virtual machine, a Docker image file can be used to generate one or more containers. Docker has a vast library of images, built by itself and by many companies, such as Microsoft. These images are available to download and use in your own environments.

For this demo, you are going to pull the image for SQL Server 2017 using the following command.

sudo docker pull mcr.microsoft.com/mssql/server:2017-latest

The sudo command executes the following docker program with administrative privileges. Docker, as stated earlier, is the application which manages the containers. Then you give the instruction to Docker, pull. Pull is the directive to download a container from Docker’s repositories.

The final piece is the image to pull. The first part, mcr.microsoft.com, indicates this image is stored in the Microsoft area of the Docker repositories. As you might guess, mssql indicates the subfolders containing SQL Server images, and server:2017-latest indicates the version of SQL Server to pull, 2017. The -latest indicates this should be the most currently patched version; however, it is possible to specify a specific version.
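As an illustration, pinning to a specific release rather than -latest just means using a more specific tag in the pull command. The tag below is only an example; check Microsoft's repository listing for the tags that actually exist:

sudo docker pull mcr.microsoft.com/mssql/server:2017-CU8-ubuntu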

Once downloaded, it is a good idea to query your local image cache to ensure the download was successful. You can do so using this simple command.

sudo docker image ls

image tells Docker you want to work with images, and ls is a simple listing command, similar to using ls to list files in the bash shell.


Running the Container

Now that the image is in place, you need to create a container to run the SQL Server. Unlike traditional SQL Server configuration, this turns out to be quite simple. The following command is used to not only create the container but run it. Note the backslash at the end of each line is the line continuation character for bash, the interpreter that will run this command (even though you’re in PowerShell). You could also choose to remove the backslashes and just type the command all on one line.

sudo docker run -e 'ACCEPT_EULA=Y' \
     -e 'SA_PASSWORD=passW0rd!' \
     -p 1433:1433 \
     --name arcanesql \
     -d mcr.microsoft.com/mssql/server:2017-latest

The first part of the line starts by passing the run command into Docker, telling it to create and run a new container. In the first -e parameter you are accepting the end user license agreement. In the second -e parameter, you create the SA (system administrator) password. As you can see, I’ve used a rather simple password; you should definitely use something much more secure.

Next, you need to map a port number for the container using the -p parameter. The first port number will be used to listen on the local computer; the second port number is used in the container. SQL Server listens on port 1433 by default, so use that for both parts of the mapping.

The next parameter, --name, provides the name for the container; here I’m calling it arcanesql.

In the final parameter, -d, you need to indicate what image file should be used to generate the container. As you can see, the command is using the SQL Server image downloaded in the previous step.


You can verify the container is indeed running using the following command.

sudo docker container ls

As with the other commands, the third parameter indicates what type of Docker object to work with, here containers. Like with image, the ls will produce a list of running containers.


Installing the SQL Server Module

Now that SQL Server is up and running, it’s time to start interacting with it from PowerShell Core. First, though, install the PowerShell Core SQL Server module.

Install-Module SqlServer

It won’t hurt to run this if the SQL Server module is already installed. If it is, PowerShell will simply provide a warning message to that effect.

If you’ve already installed it, and simply want to make sure it is up to date, you can use the cmdlet to update an already installed module.

Update-Module SqlServer

Do note that normally you would not want to include these in every script you write. You would just need to ensure the computer you are running on has the SQL Server module installed, and that you update it on a regular basis, of course testing your scripts after an update. (For more about testing PowerShell code, see my three-part article on Pester, the PowerShell testing framework, beginning with Introduction to Testing Your PowerShell Code with Pester here on SimpleTalk.)

Running Your First Query

The first query will be very simple; it will return a listing of all tables in the master database so you can see how easy it is to interact with SQL Server and learn the basic set of parameters.

The basic cmdlet to work with SQL Server is Invoke-SqlCmd. It requires a set of parameters, so you’ll place those in variables for easy reference.

$serverName = 'localhost,1433'
$dbName = 'master'
$userName = 'sa'
$pw = 'passW0rd!'
$queryTimeout = 50000
$sql = 'SELECT * FROM master.INFORMATION_SCHEMA.Tables'

For this exercise, you are running the Docker container on the same computer as your PowerShell session, so you can just use localhost for the server name. Obviously, you’ll replace this with the name of your server when working in other environments. Note that you must append the port number after the server name.

Next, you have the database name you’ll be working with, and for this example, it will be master.

The next two parameters are the username and password. In a real-world environment, you’d be setting up real usernames and passwords, but this demo will be simple and just use the SA (system administrator) account built into SQL Server. The password is the same one used when you created and ran the container using the docker run command.

Next up is the query timeout: how long should PowerShell wait before realizing no one is answering and giving up? The timeout is measured in seconds.

The last parameter is the query to run. Here you are running a simple SELECT statement to list the tables in the master database.

Now that the parameters are established in variables, you are ready to call the Invoke-SqlCmd cmdlet to run the query.

Invoke-Sqlcmd -Query $sql `
              -ServerInstance $serverName `
              -Database $dbName `
              -Username $userName `
              -Password $pw `
              -QueryTimeout $queryTimeout

Here you pass in the variables to each named parameter. Note the backtick symbol at the end of each line except the last. This is the line continuation character; it allows you to spread out lengthy commands across multiple lines to make them easier to read.

In the output, you see a list of each table in the master database.


Splatting

As you can see, Invoke-SqlCmd has a fairly lengthy parameter set. It will get tiresome to have to repeat this over and over each time you call Invoke-SqlCmd, especially as the bulk of these will not change between calls.

To handle this, PowerShell includes a technique called splatting. With splatting, you create a hash table, using the names of the parameters for the hash table keys, and the values for each parameter as the hash table values.

$sqlParams = @{ "ServerInstance" = $serverName
                "Database"       = $dbName
                "Username"       = $userName
                "Password"       = $pw
                "QueryTimeout"   = $queryTimeout
              }

If you look at the syntax in the previous code example, you’ll see that the key values on the left of the hash table above match the parameter names. For this example, load the values from the variables you created, but you could also have hardcoded the values.

So how do you use splatting when calling a cmdlet? Well, that’s pretty simple. In this next example, you’ll load the $sql variable with a query to create a new database named TeenyTinyDB, and then execute Invoke-SqlCmd.

$sql = 'CREATE DATABASE TeenyTinyDB'

Invoke-Sqlcmd -Query $sql @sqlParams

Here you call Invoke-SqlCmd, then pass in the query as a named parameter. After that, you pass in the hash table variable sqlParams, but with an important distinction. To make splatting work, you use the @ symbol instead of the normal $ for a variable. When PowerShell sees the @ symbol, it knows to deconstruct the hash table and use the key/values as named parameters and their corresponding values.

There are two things to note. I could have included the $sql as another value in the hash table. It would have looked like "Query" = $sql (or the actual query as a hard-coded value). In the demo, I made them separate to demonstrate that it is possible to mix named parameters with splatting. On a personal note, I also think it makes the code cleaner if the values that change on each call are passed as named parameters and the values that remain fairly static become part of the splat.

Second, the technique of splatting applies to all cmdlets in PowerShell, not just Invoke-SqlCmd. Feel free to implement this technique in your own projects.
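As a quick illustration that the technique is not specific to Invoke-SqlCmd, here is a minimal sketch of splatting with Copy-Item (the file paths are just examples):

$copyParams = @{ "Path"        = "./Demo/m11-install-docker.sh"
                 "Destination" = "/tmp/m11-install-docker.sh"
                 "Force"       = $true
               }

Copy-Item @copyParams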

When you execute the command, you don’t get anything in return. On the SQL Server, the new database was created, but because you didn’t request anything back, PowerShell simply returns to the command line.

Creating Tables

For the next task, create a table to store the names and URLs of some favorite YouTube channels. Because you’ll be working with the new TeenyTinyDB instead of master, you will need to update the Database key/value pair in the hash table.

$dbName = 'TeenyTinyDB'

$sqlParams["Database"] = $dbName

Technically I could have assigned the database name without the need for the $dbName variable. However, I often find myself using these values in other places, such as an informational message. Perhaps a Write-Debug "Populating $dbName" message in my code. Placing items like the database name in a variable makes these tasks easy.

With the database value updated, you can now craft a SQL statement to create a table then execute the command by once again using Invoke-SqlCmd.

$sql = @'
CREATE TABLE [dbo].[FavoriteYouTubers]
(
    [FYTID]       INT            NOT NULL PRIMARY KEY
  , [YouTubeName] NVARCHAR(200)  NOT NULL
  , [YouTubeURL]  NVARCHAR(1000) NOT NULL
)
'@

Invoke-Sqlcmd -Query $sql @sqlParams

In this script, you take advantage of PowerShell’s here string capability to spread the create statement over multiple lines. If you are not familiar with here strings, a here string lets you assign a multi-line string to a variable. To start a here string, you declare the variable and then make @ followed by a quotation mark (either a single quote or a double quote) the last thing on the line. Do note it has to be last; you cannot have anything after it, such as a comment.

The next one or more lines are what you want the variable to contain. As you can see, here strings make it easy to paste in SQL statements of all types.

To close out a here string, simply put the closing quotation mark followed by the @ sign in the first two positions of a line. This has to be in the first two character positions; if you attempt to indent it, the here string won’t work.
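As an aside, a double-quoted here string follows the same rules but expands variables inside it. Here is a minimal sketch (not part of the demo) that embeds the current value of $dbName into the query text:

$sql = @"
SELECT [TABLE_NAME]
  FROM [$dbName].INFORMATION_SCHEMA.Tables
"@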

With the here string set up, call Invoke-SqlCmd to create the table. As with the previous statement, it doesn’t produce any output, and it simply returns us to the command line.

Loading Data

In this example, create a variable with a SQL query to load multiple rows via an INSERT statement and execute it.

$sql = @'
INSERT INTO [dbo].[FavoriteYouTubers]
  ([FYTID], [YouTubeName], [YouTubeURL])
VALUES
  (1, 'AnnaKatMeow', 'https://www.youtube.com/channel/UCmErtDPkJe3cjPPhOw6wPGw')
, (2, 'AdultsOnlyMinecraft', 'https://www.youtube.com/user/AdultsOnlyMinecraft')
, (3, 'Arcane Training and Consulting', 'https://www.youtube.com/channel/UCTH58i-Gl1bZeATOeC4f25g')
, (4, 'Arcane Tube', 'https://www.youtube.com/channel/UCkR0kwYjQ_gngZ8jE3ki7xw')
, (5, 'PowerShell Virtual Chapter', 'https://www.youtube.com/channel/UCFX97evt_7Akx_R9ovfiSwQ')
'@

Invoke-Sqlcmd -Query $sql @sqlParams

For simplicity, I’ve used a single statement. There are, in fact, many options you could employ. Reading data from a file in a foreach loop and inserting rows as needed, for example.

Like the previous statements, nothing is returned after the query executes, and you are returned to the command prompt.
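As a rough sketch of the file-based approach mentioned above, you could read rows from a CSV file and insert them one at a time. The file name and its columns here are hypothetical, and real code would need to escape any single quotes in the data:

$rows = Import-Csv -Path './favorite-youtubers.csv'   # hypothetical file with FYTID, YouTubeName, YouTubeURL columns

foreach($row in $rows)
{
  # Build an INSERT statement for the current row
  $sql = "INSERT INTO [dbo].[FavoriteYouTubers] ([FYTID], [YouTubeName], [YouTubeURL])
          VALUES ($($row.FYTID), '$($row.YouTubeName)', '$($row.YouTubeURL)')"

  Invoke-Sqlcmd -Query $sql @sqlParams
}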

Reading Data

People are funny. They love putting their data into databases. But then they actually expect to get it back! Pesky humans.

Fortunately, PowerShell makes it easy to return data from SQL Server. Follow the same pattern as before: set up a query and store it in a variable, then use Invoke-SqlCmd to execute it.

$sql = @'
SELECT [FYTID]
     , [YouTubeName]
     , [YouTubeURL]
  FROM dbo.FavoriteYouTubers
'@

Invoke-Sqlcmd -Query $sql @sqlParams

Unlike the previous queries, this actually generates output.


Here you can see each row of data, and the values for each column. I want to be very precise about what PowerShell returns.

This is a collection of data row objects. Each data row has properties and methods. The sqlserver module converts each column into a property of the data row object.

The majority of the time, you will want to work with the data returned to PowerShell, not just display it to the screen. To do so, first assign the output of Invoke-SqlCmd to a variable.

$data = Invoke-Sqlcmd -Query $sql @sqlParams

If you want to see the contents of the variable, simply run just the variable name.

This will display the contents of the collection variable $data, displaying each row object, and the properties for each row.

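If you want to confirm exactly which properties the data row objects expose, here are a couple of quick checks (a minimal sketch):

# List the properties the SqlServer module created from the columns
$data | Get-Member -MemberType Properties

# Pull a single value from the first row
$data[0].YouTubeURL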

You can also iterate over the $data collection; here’s a simple example.

foreach($rowObject in $data)
{
  "$($rowObject.YouTubeName) is a favorite YouTuber!"
}

This sample produces the following output:


In this code, I just display a formatted text string, but you could do anything you want to with it, such as writing to an output file.
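For instance, here is a minimal sketch of the output-file idea; the file paths are just examples:

# Append each formatted line to a text file instead of writing it to the screen
foreach($rowObject in $data)
{
  "$($rowObject.YouTubeName) is a favorite YouTuber!" | Out-File -FilePath './favorite-youtubers.txt' -Append
}

# Or export the raw rows for use elsewhere
$data | Export-Csv -Path './favorite-youtubers-export.csv' -NoTypeInformation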

Cleanup

When I was a kid, mom always taught me to put my toys away. There are many reasons why you would want to remove containers you are no longer using. Testing is one, and you may wish to write a script to spin up a new container, load it with data, then let the testers do their thing. When done, you may wish to stop the container or delete it altogether.
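As a rough sketch of what such a reset script might look like, reusing the demo's container name, password, and splatted connection parameters (adjust these for your own environment):

# Remove any old test container, then create a fresh one from the image
sudo docker rm --force arcanesql
sudo docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=passW0rd!' -p 1433:1433 --name arcanesql -d mcr.microsoft.com/mssql/server:2017-latest

# Give SQL Server a few seconds to start, then rebuild the test database
Start-Sleep -Seconds 20
$sqlParams["Database"] = 'master'
Invoke-Sqlcmd -Query 'CREATE DATABASE TeenyTinyDB' @sqlParams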

Stopping and Starting Containers

For a first step, use Docker to see what containers are currently running.

sudo docker container ls

The output shows that on the system only one container is running.


Take the scenario of wanting to shut down the container, but not removing it. Perhaps you want to turn it off when the testers aren’t using it to save money and resources. To do this, simply use the stop command.

After issuing a stop, you should do another listing to ensure it is, in fact, stopped. You might think you could do another container ls, but note I said it lists currently running containers. If you want to see all containers, running or not, you must use a slightly different Docker command.

sudo docker stop arcanesql

sudo docker ps -a

The stop command will stop the container with the name passed in. The ps -a command will list all containers running or not.


If you look at the STATUS column, on the very right side of the output, you’ll see the word Exited, along with how long in the past it exited. This is the indicator the container is stopped.

In this example, say it is the next morning. The testers are ready to get to work, so start the container back up.

sudo docker container start arcanesql

All that is needed is to issue the start command, specifying container, and provide the name of the container to start.


As you can see, the status column now begins with Up and indicates the length of time this container has been running.

Deleting a Container

At some point, you will be done with a container. Perhaps testing is completed, or you want to recreate the container, resetting it for the next round of testing.

Removing a container is even easier than creating it. First, you’ll need to reissue the stop command, then follow it with the Docker command to remove (rm) the named container.

sudo docker stop arcanesql

sudo docker rm arcanesql

If you want to be conservative with your keystrokes, you can do this with a single command.

sudo docker rm --force arcanesql

The --force switch will make Docker stop the container if it is still running, then remove it.

You can verify it is gone by running the Docker listing command.

sudo docker ps -a


As you can see, nothing is returned. Of course, if you had other containers, they would be listed, but the arcanesql container would be gone.

Removing the Image

Removing the container does not remove the image the container was based on. Keeping the image can be useful for when you are ready to spin up a new container based on the image. Re-run the Docker listing command to see what images are on the system.

sudo docker image ls

The output shows the image downloaded earlier in this article.


As you can see, 1.3 GB is quite a bit of space to take up. In addition, you can see that the image was created 2 months ago. Perhaps a new one has come out, and you want to update to the latest—all valid reasons for removing the image.

To do so, use a similar pattern as the one for the container. You’ll again use rm, but specify it is an image to remove and specify the exact name of the image.

sudo docker image rm mcr.microsoft.com/mssql/server:2017-latest

When you do so, Docker will show us what it is deleting.


With that done, you can run another image listing using the image ls command to verify it is gone.


The image no longer appears. Of course, if you had other images you had downloaded, they would appear here, but the one for the latest version of SQL Server would be absent.

Conclusion

In this article, you saw how to use Docker, from within PowerShell Core, to download an image holding SQL Server 2017. You then created a container from that image.

For the next step, you installed the PowerShell SqlServer module, ran some queries to create a table and populate it with data. You then read the data back out of the database so you could work with it. Along the way, you learned the valuable concept of splatting.

Once finishing the work, you learned how to start and stop a container as well as remove it and the image on which it was based.

This article just scratched the surface of what you can do when you combine Docker, SQL Server, and PowerShell Core. As you continue to learn, you’ll find even more ways to combine PowerShell Core, SQL Server, and Docker.


SQL – Simple Talk
