The script simply creates a tableauconnectort.tds file and stores it in C:\temp – the XML content in the file is built dynamically from $args[0] and $args[1], which are supplied when the external tool is called from Power BI Desktop.
Save the script in C:\temp and call it ConnectToTableau.ps1.
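A minimal sketch of what such a script could look like (the .tds XML below is a simplified illustration and not the exact content of the original script; the connection attributes are assumptions):

```powershell
# ConnectToTableau.ps1 - minimal sketch; the .tds XML content is illustrative only
# $args[0] = server (e.g. localhost:12345), $args[1] = database (the model name/GUID)
$server   = $args[0]
$database = $args[1]

# Build a simple Tableau data source file pointing at the local Analysis Services instance
$tds = @"
<?xml version='1.0' encoding='utf-8'?>
<datasource formatted-name='PowerBIDesktopModel' inline='true'>
  <connection class='msolap' server='$server' database='$database' />
</datasource>
"@

$path = 'C:\temp\tableauconnectort.tds'
Set-Content -Path $path -Value $tds -Encoding UTF8

# Open the .tds file with its default application (Tableau)
Invoke-Item $path
```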
The OpenInTableau.pbitool.json file
The next step was to create a pbitool.json file and store it in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools
{
"version": "1.0",
"name": "Open In Tableau",
"description": "Open connection to desktop model in Tableau",
"path": "C:/Windows/System32/WindowsPowerShell/v1.0/powershell.exe",
"arguments": "C:/temp/ConnectToTableau.ps1 \"%server%\" \"%database%\"",
"iconData": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJAAAACQCAYAAADnRuK4AAAABmJLR0QA/wD/AP+gvaeTAAADRklEQVR4nO3dv27TUBiH4WPEitSRS+iCurO0GzdRiS5sXRhAXZhYEAxd2LoUiZtgaxb2iqWXwFiJCzgsqPRPrMb5Jc1x/TxbqgSi5O2xE3+uSwGAUeo2/QRac3R8cla6bvfqB7XOPr19s7e5Z9S2J5t+AoybgIgIiIiAiAiIiICICIiIgIgIiIiAiEziUMbR8cnZovetXbfTlbJ1dbuUy67W80UfP7XDHk83/QQexPVjW/fd9e7trSGPnxqbMCICItLEJqyeljrv593BivbRap0tfNdwH2hVDj58mfuanH5819R+axMBrduQHdvb80BdredT2zEewiaMiICICIiIgIgIiIiAiAiIiICICIiIgIhM4lDGEA5bDGMFIiIgIgIiIiAiAiISTbf1TRK2ZmWTjQvomyRszaomG61ARAREREBEBEREQESaOMdo7eeFjdBYzguzAhEREBHjHP/8fv/i3i8An3/+1dTmowVWICICIiIgIgIiIiAiAiIiICICIiIgIgIiIiAiSx8Lc3Xjcdk/nJ2VWv+/X103+/51dy/9d61ARAREpIlxjilPHvZpbfKwjxWIiICICIiIgIgIiEgTn8KGWmQAfiz/79gH9a1ARG7UP5arG29qBVqHZAXaP5ydDbj7Tqn16v0qXXdZSln4/eo77HFzE+bqxuNy/djW8MdulVLi98smjIiAiNzchI3w6saT1nULv18l3AfqfQrLPnCT80B2ooczD0STRvlF4jp+a/11juVYgYgIiIiAiAiIiICINPEp7Of29txPQC8vLib7qefZq29zX5M/P1439ZpYgYgIiMjSmzCnMY/LKg5bzGMFIiIgIgIiIiAiAiIiICICIiIgIgIiIiAiAiLSxDhHCwzML8cKRERARKJlu2+SsDUPOdnYN0nYmlVNNlqBiAiIiICICIiIgIg08eWZ88Lucl4YkyAgIgIiIiAiAiJinOOWdf0108fKCkREQEQERERARAREREBEBEREQEQERERARCZxKGPw1Y1v3R7y+Kkd9mgioLVPHjZwdeOhWps87GMTRkRARJrYhK1dA1c3fqxGsZ19SOaBhrEJIyIgIgIiIiAiAiIiICICIiIgIgIiIiAAAAAYjb8VJdQbiRXyOAAAAABJRU5ErkJggg=="
}
Test it
Now restart Power BI Desktop, and the external tool should be visible in the ribbon.
Then open a pbix file with a model and hit the button.
A PowerShell window will briefly be visible, and then Tableau opens the .tds file – we now have a new Tableau workbook with a connection to the active Power BI Desktop data model.
And we can start building visualizations that are not yet supported in Power BI.
How you can try it
You can download the files needed from my GitHub repository – link
Feedback
Let me know what you think and, if possible, share some of the visualizations you make.
It is sometimes frustrating to have data that would be interesting for CRM purposes, but that is not inside Dynamics. It is even more frustrating if the data happens to be in one (or more) Microsoft SQL Server databases and not in Dynamics 365 where you would like it to be. It should be easy to connect Dynamics with your SQL Server databases, right?
The data is yours. You keep it in Microsoft software, maybe even in the same domain or the same machine. God, it is so close and, at the same time, so out of reach!
Fortunately, there are two possibilities for getting that data in Dynamics and solving this problem for good. You can:
Migrate the data from SQL Server to D365
Synchronize the data between SQL Server and D365 regularly
For the latter case, you can actually sync in one direction only (SQL Server to Dynamics) or both directions (bidirectional synchronization ensures data is the same on both systems).
How do I Connect Dynamics to SQL Server?
The key to making this integration work is third-party software developed by Connecting Software, called Connect Bridge, combined with the Microsoft SQL Server Linked Server concept. Using these two together makes the procedure straightforward and speedy. Here are the summarized steps:
Connect Bridge – Install, activate and configure Connect Bridge
Microsoft SQL Server Linked Server – Create one that connects to Dynamics 365 CE using Microsoft SQL Server Management Studio
Business logic – Incorporate your rules and specificities into the solution
Using these steps, you end up creating what you could call a Dynamics Linked Server. Let’s see it in action!
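As a rough illustration of the Linked Server step, the setup in SQL Server Management Studio boils down to T-SQL along these lines (the linked server name, provider, DSN, and table/column names below are assumptions for illustration – the exact values come from your Connect Bridge configuration):

```sql
-- Hypothetical sketch: register a linked server that reaches Dynamics 365
-- through Connect Bridge via an ODBC data source (all names are illustrative)
EXEC sp_addlinkedserver
    @server     = N'DYNAMICS365',
    @srvproduct = N'',
    @provider   = N'MSDASQL',
    @datasrc    = N'ConnectBridgeDSN';

-- Query Dynamics data through the linked server using plain T-SQL
SELECT *
FROM OPENQUERY(DYNAMICS365, 'SELECT name FROM account');
```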
In the demo video below, we look at a Dynamics database integration example. Following the example, you will see how you can sync your database and your Dynamics 365 CE (Customer Engagement, formerly Dynamics CRM) without coding and in a couple of minutes.
Using Linked Server this way, you can connect Dynamics to your database in SQL Server or your bespoke software, without learning any API and without coding. You’ll use SQL and T-SQL only. This strategy also works well for legacy software modernization or legacy databases.
Using this technique, it is as if you can access the Dynamics 365 database directly, but without the problems you could cause if you actually went straight to the database. As you are going through the API, everything is perfectly safe.
What is Connect Bridge?
Connect Bridge is a software integration platform that enables this connection to Dynamics with no code. With Connect Bridge and the Dynamics 365 connector, you can either not code at all, as we saw above, or use the programming language of your choice (C#, Java, Python, or many, many others).
Either way, the time necessary to develop the solution you need to get data to or from Dynamics is reduced by up to 90%. After all, you are doing a Dynamics integration using the REST API… without touching the API yourself.
Wondering how this would work in a real-world situation? Check out the following case studies with integrations with Dynamics developed in a week or less by companies across the globe!
Do you think this might be useful for your company?
Get in touch with Connecting Software’s experts, and they will be glad to analyze your specific case and even provide you with a personalized demo. If you have questions or feedback, please click on Ask the Author below, and I will help you out.
By Ana Neto, Connecting Software. Connecting Software creates integration and synchronization software. It is a 15-year-old company with 40 employees spread across 4 different countries.
We are also a proud “Top Member 2019” at CRMSoftwareBlog.
In the July update of Power BI Desktop, we can now add external tools to the ribbon.
If you install the latest versions of Tabular Editor, DAX Studio, and the ALM Toolkit, they will be added as tools in the ribbon.
But you can also build and add your own tools.
David Eldersveld (link) has written an excellent series of blog posts about using Python as an external tool – link to part one – and this inspired me to give it a go as well.
A short description of what an external tool really is
An external tool points to an exe file, and you can supply the call to the exe file with arguments, including references to %server% and %database%.
The information about the external tool needs to be stored in
C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools
and the file must be named “<tool name>.pbitool.json”.
This will give me these buttons in my Power BI Desktop
My idea for an external tool
When I build models, I use Excel pivot tables to test and validate my measures, and typically I would use DAX Studio to find the localhost port in order to set up a connection to the currently open PBIX file.
So, I thought it would be nice to just click a button in Power BI Desktop to open a new Excel workbook with a connection to the current model. That would save me a couple of clicks.
If I could create an ODC file when clicking the button in Power BI and then open the ODC file (Excel is the default application for opening these), my idea would work.
I have previously used Rui Romano’s (link) excellent Power BI PowerShell tools – link to GitHub and link to his blog post about Analyze in Excel – so why not use PowerShell to do this.
Here is a guide to building your own version
Step 1 – Create a PowerShell script
I created a PowerShell file called ConnectToExcel.ps1 and saved it in the local folder C:\Temp – you can save it wherever you want it stored. (Link to sample files at the end of this post.)
The script is a modified version of Rui’s function Export-PBIDesktopODCConnection – thank you so much for sharing these.
The script contains a function that creates an ODC file, where the data source and the path of the ODC file are determined by two arguments in the function – port and path. The script also opens Excel, which then opens the file.
The script contains a reference to
$args[0]
This will, in the end, hold the value localhost:xxxxx that is provided when we click the external tool button in Power BI Desktop – it will make more sense after step 2.
Notice that I have hardcoded the path where the ODC file will be stored to C:\Temp.
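As a rough sketch (not Rui’s original code), the core of such a script could look like this – the ODC content below is a simplified illustration of the Office Data Connection format, and the output file name is an assumption:

```powershell
# ConnectToExcel.ps1 - minimal sketch; the ODC content is a simplified illustration
# $args[0] = server:port of the local Power BI Desktop instance, e.g. localhost:12345
$port = $args[0]

$odc = @"
<html xmlns:odc='urn:schemas-microsoft-com:office:odc'>
<head>
<meta http-equiv='Content-Type' content='text/x-ms-odc; charset=utf-8'>
<xml id='msodc'>
<odc:OfficeDataConnection>
 <odc:Connection odc:Type='OLEDB'>
  <odc:ConnectionString>Provider=MSOLAP;Data Source=$port;</odc:ConnectionString>
  <odc:CommandType>Cube</odc:CommandType>
  <odc:CommandText>Model</odc:CommandText>
 </odc:Connection>
</odc:OfficeDataConnection>
</xml>
</head>
</html>
"@

# Hardcoded output path, as noted above
$odcPath = 'C:\Temp\ConnectToPBI.odc'
Set-Content -Path $odcPath -Value $odc

# Opening the file launches Excel, the default application for .odc files
Invoke-Item $odcPath
```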
Step 2 – Create a .pbitool.json file
The pbitool.json file is relatively simple.
Name is the text that will appear in the ribbon.
Description is the tooltip that appears in Power BI Desktop according to the documentation – but it doesn’t work at the moment.
Path is the reference to the exe file you want to activate – and only the exe file.
Arguments are the arguments you want to pass to the exe file – and here we have the two built-in references %server% and %database%. Arguments are optional, so we could just start Excel or any other program if we wanted.
IconData is the icon that you want to appear in the ribbon – I found an icon via Google and then used https://www.base64-image.de/ to convert it to a Base64 string.
In this tool we use the powershell.exe file, which can be called with arguments specifying the script file we want to execute, and we pass the extra arguments server and database as well – in my script I only use the %server% reference, which gives me the server name and port number of the local instance.
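Putting the fields together, the OpenInExcel.pbitool.json could look like this (iconData shortened for readability; the paths assume the script from step 1 is saved as C:/temp/ConnectToExcel.ps1):

```json
{
  "version": "1.0",
  "name": "Open In Excel",
  "description": "Open connection to desktop model in Excel",
  "path": "C:/Windows/System32/WindowsPowerShell/v1.0/powershell.exe",
  "arguments": "C:/temp/ConnectToExcel.ps1 \"%server%\" \"%database%\"",
  "iconData": "data:image/png;base64,..."
}
```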
This means that when the button is clicked in Power BI Desktop, it will execute the script with the server and database values as arguments.
The localhost:xxxxx value is the first argument provided, and it can then be referred to using $args[0].
The file must then be stored in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools and in my case I called it OpenInExcel.pbitool.json.
Depending on your privileges on your computer you might be warned that you need administrative rights to save files in that location.
And if you save the script file elsewhere you need to modify the pbitool.json file.
Step 3 – Test it
Now we are ready to restart Power BI Desktop – and the button does appear.
Next – open a pbix file
This will open a Windows PowerShell window that writes the server information.
In the background it opens Excel and the ODC file – which results in a pivot table connected to the local instance.
I think the use of PowerShell opens up a lot of interesting scenarios for external tools, and I look forward to seeing what other external tools appear in the community.
Please let me know what you think and if you find it useful.
I am trying to get some references for accessing an SAP/ERP server from Mathematica. It has been suggested that possibly JLink or NetLink could be used. Assuming that I have the server address, username and password required for access, what code from either of the above links would be required to get into an SAP server? Thanks for any advice!
Community Summit, extreme 365 is just around the corner, and we can’t contain our excitement to be a part of this year’s very first virtual event. With 300+ sessions, inspiring keynotes, and much more, this Summit is a great way to learn, connect, discover, and collaborate without stepping out of your home!
Summits are always the place where you can discover the latest Microsoft Business Applications and related 3rd-party apps while speaking with the innovators who created them. It’s a place to find your technology partner and enhance your Microsoft technology stack. And like every year, Inogic has some new productivity app releases and is now a one-stop hub to assist you with the Dynamics CRM, Power Platform (PowerApps, Power BI, Power Automate), Field Service, and Microsoft Portals development/integration requirements that you always had in mind!
July 3 | 1:30 PM – 2:00 PM Central European Summer Time
Maplytics™, being quite popular in the Dynamics community, has received quite a lot of love in the past few years. Its eminent features – Territory Management, Optimized Routing, Appointment Planning, and Radius Search – are just what users prefer over other apps. Since it is going to be an engaging session, be prepared with your questions. To join this insightful session, you should be logged in and registered for the event.
So, don’t forget to register. Be our guest and let’s make this virtual event even more interesting and enriching with knowledge exchange.
Without further ado, mail us at crm@inogic.com so that we can book your spot at once!
In this session, Gaston Cruz is going to cover how to connect to the Azure DevOps API in a secure way. His example will show us how to extract metrics for a development team and how to get crucial reporting details of daily work and deployments.
Of course, it’s also going to be great to share some of the reports that we can get by connecting Power BI Desktop to dataflows (using parameters and functions to populate entities).
Greg is a drilling engineer responsible for monitoring production systems for an oil rig. His business intelligence (BI) dashboard refreshes every 30 minutes. At 3:30 PM, the dashboard refreshes and he notices a spike in a pump’s temperature and pressure, which means it needs to be replaced — right now. But the information is already too old; the pump has stopped working and production must cease, resulting in costly production loss.
The ability to analyze real-time data has become paramount in use cases like Greg’s and countless others to keep businesses competitive. With the rise of IoT and ever-increasing data from customer interactions streaming across the enterprise, if we wait to capitalize on it, data loses its value, leading to missed opportunities and significant problems.
TIBCO Spotfire X now makes it easy to connect and visually analyze data in motion like never before. With native support of real-time streaming data, Spotfire® Data Streams pushes continuous updates into Spotfire for real-time analysis. The result is live dashboards of streaming data that allow business users and frontline staff to analyze and act on data insights while they are still relevant. So Greg can anticipate the problem, fix the pump before it fails, and even increase production.
The first truly real-time BI implementation in the industry
The response to Spotfire X has been resoundingly positive, including analyst feedback that validates it as the first truly real-time BI implementation in the industry. There are other technologies that provide streaming dashboards, but none compare to Spotfire’s ability to deliver streaming analytics and explore the data.
Spotfire X and Spotfire Data Streams let you analyze real-time and historical data together, for full situational awareness so you can better respond to conditions in real time, get to the root cause of problems or issues and predict what might happen next. No other BI tool today applies analysis through direct manipulation to streaming and to historical data at the same time. Only Spotfire X allows you to understand all your data as it changes and apply artificial intelligence (AI) and natural language in one beautiful, easy-to-use tool.
From inventory management to financial fraud detection to ground-staff operations and more, the possibilities are endless.
In manufacturing and oil and gas: With equipment sensors, IoT data streams can be added in seconds to Spotfire for predictive maintenance, production forecasting, and more.
In transportation, logistics, and supply chain: Automation and real-time analytics are key for improving customer experience, assessing and acting on security risks in real-time, and optimizing operations in response to changing conditions to keep everything on time.
In banking, insurance, and retail: Applying BI to millions of live transactions in real-time can identify security breaches, spot fraudulent transactions so they can be stopped, or fix non-compliant trades before fines are incurred.
How it Works
Spotfire Data Streams has pre-built connectivity to over 80 data sources as well as custom connectors. The Spotfire Data Streams Server manages data connectivity, storage, continuous queries, alerts, client connectivity, user authentication, and security. At the heart of the server is the continuous query engine that processes high-speed streaming data, creates fully materialized live data tables, manages ad-hoc queries from Spotfire, and continuously pushes live results as conditions change in real time.
At first glance, Hammitt and Becker Safety and Supply may seem to have little in common. They are brands with distinct audiences in different industries with dissimilar business models.
But the two businesses share a common goal: deliver innovative experiences to connect with shoppers in the modern world of retail. By executing on these outside-the-box ideas, each found success and cultivated a loyal following.
Hammitt, a designer of high-end handbags and accessories, did not sell online until about two years ago. The 10-year-old company first built a customer base by hosting private parties in Southern California, which grew into bigger events over time. As the business evolved, it launched an ecommerce web store and partnered with retailers.
CEO Tony Drockton found a way to build a bridge between the online and offline channels because both held tremendous value. Earlier this year, Drockton held a 48-hour pre-sale for a new line of bags and announced it on Facebook Live. As part of the promotion, shoppers could pick up their purchases at a beach party that weekend featuring food, drinks and live music while also raising money for charity. Once word was out, social influencers who partner with Hammitt posted about the sale to spread the message and drum up excitement.
The “beach bash” was a roaring success. Several hundred people showed up, and the company raised $30,000 for a local educational foundation. Hammitt livestreamed parts of the event and pushed out fundraising updates on social media. The CEO’s creativity turned a standard product release into an opportunity to transform casual customers into brand loyalists.
“We’re in an era today where it’s exceedingly difficult to control the customer experience with the different channels and countless factors that can affect it,” Drockton said. “So we just try to control the brand message as much as possible by thinking about the customer impact of every business decision, then hope that the fans that follow Hammitt will share that with other people. To me that’s the ultimate experiential retail.”
Becker Safety and Supply was also an “offline” company for much of its history. The supplier of safety supplies and equipment in Greeley, Colo., started out as a distributor with a warehouse and no retail presence. As the business grew, it opened a retail space within its warehouse for customers seeking an in-person experience. It will soon launch an ecommerce website that allows clients to purchase boots, tools, fire retardant clothing and more.
“Customers don’t really care that we only have 15 people, that we’re this small, family-run company without a large staff to do development on ecommerce and all these kinds of things,” Vice President Devin Becker said. “They just want to be treated the same way that they would at larger stores with that personal touch that we can provide.”
Becker Safety and Supply’s products are unique because customers need training to know how to use them properly. That inspired the company to host an annual safety expo and stream training sessions on Facebook Live. It also uses Snapchat and Instagram, unconventional social platforms for a B2B retailer, to engage and interact with customers.
Despite the upheaval and uncertainty surrounding retail, these businesses are proof that customers reward outstanding experiences. These brands are finding ways to differentiate themselves from the competition and are primed to excel thanks to that ingenuity.
Retail is not going away – it’s just transforming rapidly. And if you’re in search of inspiration for your own business, the stories of Hammitt and Becker Safety and Supply should provide just that.
Discover the steps your company can take to create memorable shopping experiences in this white paper, Build the Foundation for Great Customer Experiences.
Posted on Thu, November 15, 2018 by NetSuite