Business Intelligence Info

Tag Archives: “Current”

#PowerBI – External tool to connect Excel to the current PBIX file

July 28, 2020   Self-Service BI

In the July update of Power BI Desktop, we can now add external tools to the ribbon.

If you install the latest versions of Tabular Editor, DAX Studio and the ALM Toolkit these will be added as tools in the ribbon.

But you can also build and add your own tools.

David Eldersveld (link) has written an excellent series of blogposts about using Python as an external tool – link to part one – and this inspired me to give it a go as well.

The official documentation can be found here.

Short description of what an external tool really is

An external tool points to an exe file, and you can supply the call to the exe file with arguments, including references to %server% and %database%.

The information about the external tool needs to be stored in

C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

and the file must be named “<tool name>.pbitool.json”.


This will give me these buttons in my Power BI Desktop


My idea for an external tool

When I build models, I use Excel pivot tables to test and validate my measures, and typically I would use DAX Studio to find the localhost port to set up a connection to the currently open PBIX file.

So, I thought it would be nice to just click a button in Power BI Desktop to open a new Excel workbook with a connection to the current model. That would save me a couple of clicks.

If I could create an ODC file when clicking on the button in Power BI and then open the ODC file (Excel is the default application to open these) my idea would work.

I have previously used Rui Romano’s (link) excellent Power BI PowerShell tools – link to GitHub and a link to his blog post about Analyze in Excel – so why not use PowerShell to do this?

Here is a guide to build your own version

Step 1 – Create a PowerShell script

I created a PowerShell file called ConnectToExcel.ps1 and saved it in the local folder C:\Temp – you can save it wherever you want it stored. (Link to sample files at the end of this post.)

The script is a modified version of Rui’s function Export-PBIDesktopODCConnection – thank you so much for these.

Function ET-PBIDesktopODCConnection
{
    # Modified from https://github.com/DevScope/powerbi-powershell-modules/blob/master/Modules/PowerBIPS.Tools/PowerBIPS.Tools.psm1
    # (the function Export-PBIDesktopODCConnection)

    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $false)]
        [string]
        $port,

        [Parameter(Mandatory = $false)]
        [string]
        $path
    )

    # The ODC file content - the Data Source is set to the local PBIX instance ($port)
    $odcXml = "<html xmlns:o=""urn:schemas-microsoft-com:office:office"" xmlns=""http://www.w3.org/TR/REC-html40""><head><meta http-equiv=Content-Type content=""text/x-ms-odc; charset=utf-8""><meta name=ProgId content=ODC.Cube><meta name=SourceType content=OLEDB><meta name=Catalog content=164af183-2454-4f45-964a-c200f51bcd59><meta name=Table content=Model><title>PBIDesktop Model</title><xml id=docprops><o:DocumentProperties xmlns:o=""urn:schemas-microsoft-com:office:office"" xmlns=""http://www.w3.org/TR/REC-html40""> <o:Name>PBIDesktop Model</o:Name> </o:DocumentProperties></xml><xml id=msodc><odc:OfficeDataConnection xmlns:odc=""urn:schemas-microsoft-com:office:odc"" xmlns=""http://www.w3.org/TR/REC-html40""> <odc:Connection odc:Type=""OLEDB""><odc:ConnectionString>Provider=MSOLAP;Integrated Security=ClaimsToken;Data Source=$port;MDX Compatibility=1; MDX Missing Member Mode=Error; Safety Options=2; Update Isolation Level=2; Locale Identifier=1033</odc:ConnectionString><odc:CommandType>Cube</odc:CommandType><odc:CommandText>Model</odc:CommandText></odc:Connection></odc:OfficeDataConnection></xml></head></html>"

    # The location of the ODC file to be opened
    $odcFile = "$path\excelconnector.odc"
    $odcXml | Out-File $odcFile -Force

    # Create an Excel.Application object via the COM interface
    $objExcel = New-Object -ComObject Excel.Application

    # Make Excel visible
    $objExcel.Visible = $true

    # Open the ODC file, which gives us a workbook connected to the model
    $WorkBook = $objExcel.Workbooks.Open($odcFile)
}

write $args[0]

ET-PBIDesktopODCConnection -port $args[0] -path "C:\Temp"

The script contains a function that creates an ODC file; the data source and the location of the ODC file are determined by two arguments to the function – port and path. The script then opens Excel, which in turn opens the file.

The script contains a reference to $args[0]. This will, in the end, hold the value localhost:xxxxx that is provided when we click the external tool button in Power BI Desktop – it will make more sense after step 2.

Notice that I have hardcoded the path where the ODC file will be stored to C:\Temp.

Step 2 – Create a .pbitool.json file

The pbitool.json file is relatively simple.


Name is the text that will appear in the ribbon.

Description is the tooltip that should appear in Power BI Desktop according to the documentation – but it doesn’t work at the moment.

Path is the reference to the exe file you want to activate – and only the exe file.

Arguments are the arguments that you want to pass to the exe file – and here we have the two built-in references %server% and %database%. Arguments are optional, so we could just start Excel or any other program if we wanted.

IconData is the icon that you want to appear in the ribbon – I found an icon via Google and then used https://www.base64-image.de/ to convert it to a base64 string.

In this tool we use powershell.exe, which is called with arguments specifying the script file we want executed, and we pass the extra arguments %server% and %database% as well. In my script I only use the %server% reference, which gives me the server name and port number of the local instance.
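Putting the fields together, a minimal .pbitool.json could look like the sketch below. The name, description, and icon placeholder are my own examples, not values from the post – use your own, and make sure the arguments line points to wherever you saved the script:

```json
{
    "version": "1.0",
    "name": "Open in Excel",
    "description": "Open a new Excel workbook connected to the current PBIX file",
    "path": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe",
    "arguments": "C:\\Temp\\ConnectToExcel.ps1 \"%server%\" \"%database%\"",
    "iconData": "data:image/png;base64,<your base64 string here>"
}
```

Note that backslashes in the paths must be escaped in JSON.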

It means that when the button is clicked in PowerBI Desktop it will execute

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe C:\Temp\ConnectToExcel.ps1 localhost:xxxxx databasename

The localhost:xxxxx value is the first argument provided, and it can then be referred to by using $args[0].

The file must then be stored in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools – in my case I called it OpenInExcel.pbitool.json.

Depending on your privileges on your computer you might be warned that you need administrative rights to save files in that location.

And if you save the script file elsewhere you need to modify the pbitool.json file.

Step 3 – Test it

Now we are ready to restart Power BI Desktop – and the new button does appear.

Next – open a pbix file

This will open a Windows PowerShell window and write the server information


And in the background it opens Excel and the ODC file – which results in a pivot table connected to the local instance.


With a connection to localhost:52510.


The files

You can download the files needed from here – https://github.com/donsvensen/erikspbiexcelconnector

Feedback

I think the use of PowerShell opens up a lot of interesting scenarios for external tools, and I look forward to seeing what other external tools appear in the community.

Please let me know what you think and if you find it useful.


Erik Svensen – Blog about Power BI, Power Apps, Power Query


Three Supply Chain Trends Accelerate In Response To Current Crisis

May 22, 2020   SAP

In Episode 33 of the SAP Experts Podcast, Martin Barkman, Senior Vice President & Global Head of Solution Management for Digital Supply Chain at SAP, outlines the supply chain agenda in the age of COVID-19. This post looks at the current state of the discussion among consultancies, SAP experts, and customers about which trends will accelerate throughout the current crisis.

Transparency

In a situation like the current crisis, it is easy to forget where the economy came from. As Hans Thalbauer, then SAP SVP of Digital Supply Chain and Industry 4.0, pointed out in an episode of the SAP Experts Podcast recorded in late 2019, volatile political and social environments were some of the most pressing issues supply chain leaders were facing even before COVID-19 entered the stage. After all, the pre-coronavirus times were those of Brexit, the Sino-American trade wars, and widespread social unrest.

The bad news is: COVID-19 does not replace these types of disruptions, and instead is likely to even exacerbate them. First, the evolving “dance” phase is playing out at different speeds in different countries. Each country is at the constant peril of a second wave necessitating a return of harsh interferences with businesses’ operations (and, as a result, with the supply chains crossing their territory). Second, the crisis is heating up the dynamic of preexisting crises – just take US-China relations or the widening political cleavages within the EU as examples. The good news is: The business world has already learned a great deal about how to handle these types of uncertainties and upheavals. The food supply, for example, has remained stable in most markets even throughout the toughest lockdowns.

This is in part thanks to the visibility into stock levels, demand development, and supply chain constraints afforded by modern business technology. Consultancies like BCG and McKinsey are therefore united in prescribing one concept for handling the uncertainties at hand: A “nerve center” or “control tower” that has visibility over the entire supply chain, rendering the future in alternative scenarios and respective reaction protocols. The phase of volatility and uncertainty ahead, it thus seems, will be a major argument for reinforcing “insight-to-action” capabilities in supply chain management: The ability to act quickly, on the basis of real-time data and predictions.

Automation

Even before COVID-19 entered the scene, automation was high up on the agenda of supply chain leaders, as the promise of producing in a more agile manner with higher transparency and lower costs is a convincing case in itself. Now, reports of consultancies like Kearney and EY highlight that this case was further reinforced by the epidemic.

First, having fewer humans crowded around a given production line reduces the risk of viral transmission among them. This not only serves to protect the workforce but also the continuity of the operations themselves: Regulators keen on enforcing physical distancing rules will have fewer concerns about factories with fewer humans in them, and sick leaves will not have as much of an impact on production plans.

Second, the traumatic experience of the West’s inability to adequately equip its healthcare system with protection equipment and testing kits is reinforcing the political desire to repatriate manufacturing. If this pressure persists (and is perhaps intensified by consumer and/or shareholder preferences), manufacturers could see a paradigmatic shift in the way they operate. As BCG points out, “(artificial intelligence) also allows (manufacturers) to operate a larger number of small, efficient facilities nearer to customers – rather than a few massive factories in low-wage nations – by deploying advanced manufacturing technologies such as 3D printing and autonomous robots that require few workers.”

A recession boosting the adoption of such automation technologies would not be without historical precedent. In fact, it would rather be the norm. According to the Economist Intelligence Unit, it has been in recessions that the adoption of automated processes has really spiked alongside the pressure to lower costs. It would not be a surprise if this historical pattern repeats itself in this crisis.

New business models

Until there is a vaccine against the coronavirus, the economy will be held in an artificially suppressed state – a condition the Economist aptly terms the “90% economy.” As consumers are deprived of income and spending options, a drought in revenue will eat its way up the value chain, creating a shortfall in cash flow on all levels. When it comes to spending and investments, businesses will have to find ways to do more with less. For suppliers, this implies that being able to service increasingly cash-strapped customers will become a competitive advantage.

In the short term, this usually takes the form of more generous payment terms – offering the option to defer payments or to pay in installments, for example. But as the economy will likely persist in dire straits for an extensive period of time, new business models could emerge that more closely align spending and consumption.

Take what KAESER KOMPRESSOREN has been doing for quite some time as an example: Instead of selling its compressors (which would require a considerable upfront investment on the part of its clients), it installs the compressors on clients’ sites while retaining ownership. Then, they charge for the compressed air – a classic pay-per-use model. This allows KAESER KOMPRESSOREN’s customers to upgrade their compressors while aligning their cash outflows with their cash inflows, alleviating their liquidity situation. Another variant on the same theme is the addition of services to an existing product.

Hoval, for example, sells predictive maintenance as a value-added service on top of its heating systems. This adds a revenue stream to Hoval’s business that ensures customer loyalty while not requiring too high of a startup investment.

What’s next

Imagine supply chains once this crisis has abated: There is a good chance that future companies will have deeper insight into their operations thanks to increased transparency, they will do more with less, thanks to a higher level of automation, and they will deliver value to their customers in entirely new ways.

Regardless of how bad the recession is going to play out, the companies that master these three points will be more resilient and more profitable down the line. Yes, the future may be uncertain, but we will surely see some interesting times ahead in supply chain management.

Explore “Global Supply Chain Management For A Rapid And Dynamic Transformation Environment.”

Let’s block ads! (Why?)

Digitalist Magazine


Fail, Survive or THRIVE? What will be your outcome to the current economic downturn?

May 21, 2020   Microsoft Dynamics CRM

What will be your outcome to the current economic downturn?

The Harvard Business Review studied the last 3 recessions finding very important information that every business leader should review.

A few of the study’s findings:

- Firms that cut costs faster and deeper than rivals don’t necessarily flourish. They have the lowest probability – 21% – of pulling ahead of the competition when times get better.*

- Companies that reduce costs selectively and invest relatively comprehensively in the future by spending on marketing and other key areas have the highest probability – 37% – of regaining pre-recession sales and profitability levels.*

- Companies that invest in, for example, marketing during a recession may see only modest benefits during the recession, but add substantially to sales and profits afterward.*

Roaring Out of Recession studied the strategy and performance of 4700 businesses during the past 3 global recessions, breaking down the data into three periods: the three years before a recession, the three years after, and the recession years themselves.

The findings are stark. Seventeen percent of the companies in the study didn’t survive a recession. The survivors were painfully slow to recover from the battering: about 80% of them had not yet regained their pre-recession growth rates for sales and profits three years after a recession.*

Only a small number of companies—approximately 9% of the sample—flourished after a slowdown, outperforming rivals in their industry by at least 10% in terms of sales and profits growth*

Want to learn what businesses did and didn’t do that determined how the recession impacted their business? Join us for a free, informative webinar:

Wednesday, June 10th, 1pm Eastern – Click Here to Register

Speakers


CRM Software Blog | Dynamics 365


The Future Plant: Current Challenges In Asset Performance

May 12, 2020   SAP

Part 1 of a three-part series

When Horst started his work as a machine technician at a manufacturing plant 20 years ago, asset management looked very different than it looks today. Having climbed up the career ladder to become an asset manager, Horst has created a modern maintenance environment that tackles many of the major problems German manufacturing companies are concerned with.

Horst no longer has to do a daily tour through the plant to note downed-machine issues or check on maintenance-due dates. Instead, Horst uses asset management software that provides him a constant overview of all assets, right from his desk. Every asset is digitally represented by its digital twin and can permanently be monitored via a visual display.

By continuously collecting relevant data, designated devices automatically enrich an asset’s digital twin with information about its current performance and condition. Data analytics algorithms can use this information to generate a set of relevant KPIs throughout each asset’s entire lifecycle.

For Horst, it is crucial to always be prepared for any possible machine breakdown. Therefore, he is especially interested in knowing an asset’s mean time to failure (MTTF) or mean time between failures (MTBF), as well as the frequency of these incidents.

Knowing particular failures, and how often they typically occur with certain assets, helps Horst classify machine problems to common failure modes and get an understanding of when a failure is likely to happen. It also supports him in grouping his assets into certain risk categories depending on how often, how severe, and how detectable failures are occurring with an asset. This entire process is called Failure Mode Analytics – an important analysis for strategic asset management that is strongly enabled by the ability to monitor each asset’s performance.

Two other important KPIs are relevant once a predicted failure occurs: mean time to repair (MTTR) and mean downtime. As the main measures of machine availability, these KPIs are supposed to be relatively low to enable a maximum level of production continuity.

Following the principles of lean management, Horst is constantly engaged in putting appropriate measures in place to reduce the time a machine is down for repair. In this context, respective breakdown costs also play a meaningful role in managing asset performance.

Last, but not least, an asset’s comprehensive performance can be evaluated in the Overall Equipment Effectiveness KPI. This KPI indicates the percentage of time in which an asset is producing only good parts (quality) as fast as possible (performance) with no stop time (availability). Combining the aspects of quality, performance, and availability makes this measure a very powerful tool for Horst in assessing his assets and in gaining data-based knowledge about his overall plant productivity.
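To make the arithmetic behind these KPIs concrete – with illustrative numbers of my own, not figures from the post – availability can be derived from MTBF and MTTR, and OEE is simply the product of the three factors:

```python
def availability(mtbf_hours, mttr_hours):
    """Inherent availability: the fraction of time the asset is up."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def oee(availability_pct, performance_pct, quality_pct):
    """Overall Equipment Effectiveness: availability x performance x quality."""
    return availability_pct * performance_pct * quality_pct

# A machine that runs 200 hours between failures and takes 4 hours to repair...
a = availability(200.0, 4.0)           # ~0.980
# ...running at 95% of ideal speed and producing 98% good parts
print(round(oee(a, 0.95, 0.98), 3))    # 0.913
```

Even with high availability, losses in performance and quality multiply, which is why OEE is a stricter measure than any single KPI on its own.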

The variety of different KPIs makes it possible to have continual, real-time insight into all assets and their performance. For Horst, who always needs to have a profound overview of his assets’ current state, this really makes life easier. More importantly, the asset performance software equips him with a reliable base for decision-making.

While in the past, most decisions were made based on gut feeling, today the digital twin and its KPIs serve as the source for making machine diagnoses and determining asset maintenance routines. Also, standardized KPIs allow comparisons between several groups of assets or across different plants. This makes processes more transparent and more reliable, therefore helping Horst achieve the best possible asset operation.

By enabling technologies for the smart factory, companies are achieving Mission Unstoppable: making facilities management a transparent, manageable process.


Digitalist Magazine


Using the Common Data Service (Current Environment) Power Automate Connector

March 19, 2020   Microsoft Dynamics CRM

Although the new Common Data Service (Current Environment) connector in Power Automate has been available for a while, I am still running into confusion about which connector is which.

Let’s break it down.

Common Data Service Connector

This was the initial CDS connector within Power Automate. This is available in the context of a solution AND outside the context of a solution.

What does this mean?

If I simply navigate to My Flows and create a new Flow, this is not within a solution.


With this connector, you will only see these Triggers:

And these Actions.

Additionally, you are required to populate the environment manually for each Trigger and Action.

You can find additional detail on this connector HERE

Common Data Service (Current Environment)

How do I access this connector? This is the most common question I get right now.

This must be within a solution. You can use an existing solution or create a new solution.


Open the solution and create a new Flow

  


This part is important! Since the original CDS connector is also available within a solution, you will see TWO connectors side by side. Hover over the connector to ensure you see Common Data Service (Current Environment).

  


This is the connector you want to use. Why? Take a look at the Triggers and Actions:

  


Although there is only one trigger, you have these new options:

  


As you can see, you do not need to specify the environment manually, as it uses the environment the solution currently resides in.

Here are the Actions:


You can find additional detail on this connector HERE

Thanks for reading!

Aaron Richards


Dynamics 365 Customer Engagement in the Field


Teradata Earns Top Ranking in “Current Offering” Category and Named a Leader in Data Management for Analytics Evaluation by Independent Research Firm

February 20, 2020   BI News and Info

Teradata (NYSE: TDC) announced today that Forrester Research has named the company a Leader in “The Forrester Wave™: Data Management For Analytics, Q1 2020,” written by principal analyst Noel Yuhanna, on February 12, 2020. Forrester analyzed and scored the top vendors in the Data Management for Analytics market according to 25 criteria, giving Teradata the highest score in the Current Offering category.

Forrester’s positioning of Teradata in the Data Management for Analytics wave evaluation is based on the company’s Vantage platform, Teradata’s flagship product that delivers analytics, data lakes and data warehouses in a single, unified platform. By using 100% of available data to uncover real-time business intelligence at scale, Teradata Vantage enables businesses to turn analytic insights into answers.

In the report, Forrester evaluated Teradata among 14 Data Management for Analytics providers and had this to say: “Teradata offers advanced DMA capabilities, including in-database analytics, distributed query processing, self-service, automation, workload management, and broad security. Teradata Vantage provides the same advanced analytic processing across all deployment options, including Teradata Cloud, public cloud options like AWS, Azure, and Google, and as-a-service and on-premises, which gives clients choice and flexibility.”

Forrester also notes that Teradata’s reference customers, “…like its ease of use, hybrid cloud and independent storage, and compute processing capabilities,” and summarizes that Teradata, “…remains a prominent choice, especially for hybrid deployments where scalability and availability are critical.”

“Our customers are the largest and most data-intensive companies in the world and rely on Teradata to deliver advanced analytic processing, at scale, regardless of their data infrastructure,” said Chris Twogood, senior vice president of marketing at Teradata. “We believe Teradata’s position as a Leader in Forrester’s 2020 evaluation makes clear that our Vantage platform is delivering real business value to our customers, helping them move from analytics to answers with a cloud-forward platform that gives them the choice and flexibility they need.”

Read the complete “The Forrester Wave™: Data Management For Analytics, Q1 2020” report here.


Teradata United States


4 Ways to Leverage Your Current Customer Base to Drive More Business

December 12, 2019   CRM News and Info

by Helen Veyna

Last updated December 12, 2019

As marketers, we’re always chasing the next big trend and trying to come up with new and innovative campaigns that will enable us to stand out and resonate with our target audience. What many of us fail to realize, however, is that our own customers can often speak better to our value than a quirky ad or paid campaign, meaning that developing solid and lasting relationships with them can pay off big in the long run. 


Leveraging your current customers to secure new leads is one of the most effective marketing techniques at your disposal. Your current customers understand your lead’s pain points, speak their language, and have amazing results to share that show how your solution works. They can make the case for your product in less time than it takes for leads to go through an entire nurture campaign. That’s not to say you should abandon those efforts (because they are extremely important to creating an enjoyable customer journey), but it’s essential that you have several customers who are willing to speak to your organization’s excellence.

Your current customers aren’t just valuable in helping you attract new leads. When you fully leverage these relationships, you’ll find that there’s immense room for growth within your current customers. If you do your due diligence in collecting customer insights and checking in with clients to make sure they’re getting the most out of your products and services, you’ll discover that many of them are great candidates for upsell or cross-sell opportunities. 

Still not convinced? Keep reading to learn 4 ways to leverage your current customer base to drive more business. 

1. Collaborate With Customers to Develop Success Stories 

As I previously mentioned, your customers likely have great results to share that prove the value of your product or solution. It is also very likely that they are already sharing their success among their marketing friends and putting in a good word for your brand. And while that might help you secure one or two new customers, you’re really shooting yourself in the foot if you don’t find a way to share these stories with a wider audience. 

Developing customer success stories is simple (the hardest part is just scheduling a brief 30-45 minute interview) and can result in significant ROI. For example, a good success story can help you close a deal if a customer is looking for a particular use case that resonates with the challenges they are currently facing. 

To make sure you have all your bases covered, a good place to start building your success story library is by identifying a few happy customers in various industries. Reach out to them and ask if they’d like to schedule a time to talk and collaborate with them to create content pieces you can share through your website, automated nurture campaigns, and social media. 

Don’t worry if you’re new to reaching out to customers to secure interviews — the process is much simpler than you probably think. You can identify happy customers who might be willing to talk to you by speaking with your account managers, reviewing customer surveys and net promoter scores (NPS), and monitoring social media for any positive mentions. You can even place a prominent callout in your customer communications to see if your contacts are interested. Chances are that many of them will jump at the opportunity. After all, who doesn’t love some free publicity? 



2. Invite Your Customer Advocates to Spread the Word on the Web

In a world where customers are accustomed to conducting thorough research before making a purchasing decision, establishing a positive web presence is invaluable. To do that, however, you need customers who are willing to leave a good review on an industry website or rave about you on social media. This sort of advocacy is critical to your success.

Your target audience likely references popular and trustworthy websites when weighing their options. Inviting your customer advocates to leave a good review for you on these sites can help you put your best foot forward. Although many customers won’t need an incentive to share their feedback, sending a small gift (such as a Starbucks gift card) is a great way to show your appreciation and increase your chances of securing a review. 

You should also make it a priority to have a positive presence on social media, especially since many of your current customers and leads are likely active on these channels. A great way to do that is by encouraging your advocates to share positive results they’ve experienced on their personal profiles or company accounts, making sure to tag your company and use relevant hashtags. This will improve your chances of appearing in relevant searches and help you establish positive brand recognition among your target audience. 

Act-On’s Advanced Social Media Module can help you easily monitor mentions and engage with your customer advocates. Continuing the conversation by commenting on their posts will allow you to keep the momentum going on their own profiles, improving your chances of showing up in your prospects’ feeds. You can also repost any posts that really stand out on your own social media profiles to ensure your followers see them as well.


3. Ask Your Customers to Serve as References

Your high-intent leads are likely evaluating several options by the time they talk to Sales. They’ve probably already read up on everything you have to offer, and maybe even sat through a demo, so what they’re looking for at this stage is additional proof that you can truly do what you say you do.

This is a great time to pull out some of those success stories you’ve developed. Or you can go the extra mile by scheduling some time for your leads to chat with current customers. This allows your prospects to ask questions they wouldn’t be able to ask one of your salespeople and learn more about how their peers are benefiting from your solution. 

To increase your chance of success, you should aim to set up leads with customer advocates who work in similar industries or have similar use cases. Getting as targeted as possible with your customer reference will help ensure that they can provide helpful answers to all your lead’s questions and highlight your organization’s best features.

4. Nurture Your Customers to Improve Retention and Match Them With New Products and Services

We’ve talked a lot about how to leverage your current customer base to charm new leads. A mistake that many companies make, however, is failing to acknowledge the opportunity for growth that exists within their current customer base. 

Your current customers have already demonstrated that they believe in the value of your brand, so half your work is already done. Your task now is to keep them happy and offer them opportunities to grow with your company. 

An easy and effective way to do that is by entering your customers into automated nurture campaigns that share best practices about your product and offer the opportunity to learn about new products and/or services. You can then track engagement to automatically segment customers that are prime upsell or cross-sell opportunities and enter them into relevant campaigns to help them learn more about these new options. 

Last but not least, you should also be tracking customer engagement to identify which customers need a little bit more love. Integrating your CRM with a marketing automation platform can make it easy to flag these contacts so that your account managers can reach out to them directly to offer extra support. Studies show that attracting new customers costs 5X more than retaining current ones, so it’s in your best interest to show them that you’re committed to their success. 

Marketing Automation Can Help You Build Long-Lasting Customer Relationships

Whether your focus is customer retention or converting more leads, the right marketing automation tool allows you to leverage your current customer base with ease. Act-On can help you easily distribute content, develop automated nurture campaigns, collect valuable customer insights, and even manage your social media efforts so you can drive more ROI from your current customers. 

If you want to learn how to leverage marketing automation to strengthen your customer relationships and drive more business, check out our eBook “Optimize the Customer Journey” (linked below). 

And, if you’d like to learn more about Act-On’s capabilities and how our marketing automation platform can help you build a more comprehensive and effective digital marketing strategy, feel free to fill out this brief form to book a demo with one of our digital marketing experts. 



Posted on December 12, 2019


Act-On Blog


Current And Future Role Of Wearables In Healthcare

June 4, 2019   SAP

When you think about the role of technology in healthcare, the first things that probably come to mind are new treatment methods, fancy diagnostic equipment, and so on. However, we are likely standing on the threshold of a complete transformation of the healthcare industry by something that, at a glance, looks much humbler: wearables that merely provide doctors with information about their patients.

There are already plenty of medical applications for wearables, and the market is projected to grow to $60 billion by 2023. Let’s take a look at what this means for the industry.

Personalization

John Hancock, one of the most prominent North American insurers, stated in late 2018 that it intends to stop selling traditional life insurance and will instead adopt interactive policies that track fitness and health data via wearable devices and smartphones. This means the health status of every client will be tracked and considered when the company determines the risks associated with the individual’s lifestyle and habits. Clients with habits associated with shorter lifespans (e.g., smoking, low levels of physical activity, unhealthy diet) will have to pay higher premiums. In the long run, this affects not just the insurance industry but healthcare as well, as clients are incentivized to adopt healthier lifestyles, thus decreasing the strain on healthcare.

Remote patient monitoring

While a small smartwatch isn’t as accurate as a full-scale electrocardiograph with 12 electrodes, it has a significant advantage: you can wear it all day, every day. Measuring pulse throughout the day across a wide range of activities is far from the only health measurement a smartwatch can provide. The more data these devices gather, the greater the possibility for healthcare to establish a preventive model that interprets patient data and calls for treatment before a crisis.

Early diagnosis

The wide adoption of wearables that track patients’ health stats enables the collection of huge amounts of data that can be used to establish patterns with the help of AI and machine learning. The system will be able to use the data collected from each individual to predict potential health problems before they arise, allowing for cheaper and more effective preventive measures, as opposed to treating a disease when it is in full swing.

Medication adherence

Patients tend to forget or forgo taking their medications for many reasons. Wearables can alert people when it is time to take their meds, track when they take them, and inform doctors when they don’t adhere to the prescribed regimen. As a result, medical professionals can monitor how strictly their patients follow instructions at home.

More complete information

Normally, doctors have very limited information about their cases. They have to rely on what patients report about their symptoms and the onset of the illness (data which is often incomplete or imprecise) and the patients’ medical history based on their previous interactions (which, again, may be limited). A wearable device collects all of a patient’s data in real time; it doesn’t make mistakes or forget to report an important symptom. As a result, a doctor gets an exhaustive report on the patient’s condition over time and can make a more thorough analysis, leading to more informed and optimal decision-making.

Cost savings

Widescale adoption of expensive wearables and accompanying tech may look like a significant investment, but it can save healthcare millions of dollars in the long run. These devices allow doctors to track their patients’ conditions at a distance, often eliminating the need to transfer them to a medical facility. Recognizing symptoms at an early stage allows for less expensive treatments.

These applications for wearables in the healthcare industry are still in their infancy, but we are likely to see this market grow at a breakneck pace in the coming years.

For more insights, visit the SAP Personalized Medicine online hub or continue the discussion on Twitter.


Digitalist Magazine


Why You Shouldn’t Hardcode the Current Database Name in Your Views, Functions, and Stored Procedures

March 19, 2019   BI News and Info

“There are only two hard things in Computer Science: cache invalidation and naming things”
Phil Karlton

I’m terrible at naming things. I recently wrote some quick code to reproduce a design problem and demonstrate several options for solutions, and later realized that I’d named my objects dbo.Foo, dbo.FooFoo, and dbo.Wat.

But I feel strongly about a few important principles about referring to objects that are already named in a relational database, specifically SQL Server and Azure SQL Database.

Most of the time, you should use a two-part name for objects in the current database. It’s important to know your database context. When referencing an object in the current database, you should not specify the database name in the reference.

For example, if you’re creating a view, function, or stored procedure…

  • ‘SELECT Col1 from dbo.Foo’ is good
  • ‘SELECT Col1 from CurrentDatabaseName.dbo.Foo’ is not good

This might seem like quibbling, but there’s an important difference here: hardcoding the database name in the second command means that you are restricting the code in this object to only work when the database has that exact name. You’re putting a dependency on that database name.
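To make the difference concrete, here is a minimal sketch borrowing the dbo.Foo table from the example above (the view names are hypothetical):

```sql
-- Good: two-part name. The view works no matter what the database is called.
CREATE VIEW dbo.FooView
AS
SELECT Col1
FROM dbo.Foo;
GO

-- Not good: three-part name. The view breaks as soon as the database is
-- restored or deployed under any name other than CurrentDatabaseName.
CREATE VIEW dbo.FooViewHardcoded
AS
SELECT Col1
FROM CurrentDatabaseName.dbo.Foo;
```

Both views return the same rows today; only the second stops working when the database name changes.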

When might a database need to have multiple names?

It’s quite common to need a database to be able to operate under a different name. Here are a few use cases:

In development, test, and pre-production (staging) environments, it’s a common practice to use different database names than production. This not only allows for multiple iterations of a database to be on the same instance of SQL Server, but the database name can make it more obvious which environment you’re connected to. That makes it less likely to have those moments of “oops, I didn’t mean to run that script against production!”

When branching database code in source control, you may wish to have a different database for the branch you are working on, with specific sample data for that branch. Typically, it’s convenient to keep these databases on the same instance of SQL Server, so they need to have different names.

When building database code to validate that your database objects compile from source, it’s better not to hard-code the database name in the build. If you do need to hard-code the name, you must ensure that only one build server at a time can run a build for that database on that instance; otherwise, multiple builds will collide on that database.

What about ‘deferred name resolution’?

If you’ve been working with SQL Server for a while, you might wonder about my comment about building database code, because of a feature called ‘deferred name resolution.’

Deferred name resolution has been around in SQL Server for a long time, and it’s available in all editions, from LocalDB to Enterprise. There’s no setting to enable this; it’s on all the time. This feature allows you to reference objects which don’t exist when you create stored procedures and some functions. SQL Server gives you a line of credit that those objects will exist at the time of execution.

This allows a build operation (which validates that your database code will create properly from source control) to succeed, even if you have hardcoded references to a specific database name which doesn’t exist on the build server — at least when it comes to stored procedures and some functions.
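A quick way to see deferred name resolution in action (the object names here are hypothetical): the procedure below creates successfully even though the table it references does not exist, while an equivalent view fails at creation time.

```sql
-- Succeeds: stored procedures get deferred name resolution, so the
-- missing table is only checked when the procedure is executed.
CREATE PROCEDURE dbo.GetStuff
AS
SELECT Col1
FROM dbo.TableThatDoesNotExistYet;
GO

-- Fails with "Invalid object name": views are bound to their referenced
-- objects at creation time, with no deferred name resolution.
CREATE VIEW dbo.StuffView
AS
SELECT Col1
FROM dbo.TableThatDoesNotExistYet;
```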

But there are some important gotchas:

  • Deferred name resolution doesn’t work in views and in inline-able functions
  • It’s better for our code to not be dependent upon the database having a specific name, so deferred name resolution isn’t necessarily a great feature for builds, anyway

Rule of thumb: don’t use the database name unless you absolutely have to

If you’re writing a cross-database query, yes, database names may need to come into play in your objects.

However, even with cross-database references, often folks find the dependency issue problematic: synonyms are a common tool to be able to dynamically set cross database references and limit the naming dependency.
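As a sketch of that synonym pattern (SalesDb and dbo.Customer are made-up names): the synonym concentrates the cross-database dependency in a single statement, which deployment scripts can re-point per environment.

```sql
-- The only place the other database's name appears. Drop and re-create
-- the synonym per environment (dev/test/prod) to re-point it.
CREATE SYNONYM dbo.CustomerSource
FOR SalesDb.dbo.Customer;
GO

-- Everything else uses the local two-part name and stays portable:
SELECT CustomerId
FROM dbo.CustomerSource;
```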

As long as you’re querying inside the current database, however, keep it simple: don’t specify the database name.

Commentary Competition

Enjoyed the topic? Have a relevant anecdote? Disagree with the author? Leave your two cents on this post in the comments below, and our favourite response will win a $50 Amazon gift card. The competition closes two weeks from the date of publication, and the winner will be announced in the next Simple Talk newsletter.


SQL – Simple Talk


Should You Upgrade Your Current ERP or Find a New One?

February 3, 2019   OnContact

If you have an old, outdated, or flawed ERP (Enterprise Resource Planning) software system, then this article is for you. When it is time to improve your software situation, you have two options: upgrade the system from your current provider or start fresh with a new vendor. If this is a consideration for you, here are five questions to aid in your decision-making process.

  1. What does your staff think about the current system?

Ask your employees what they think about the ERP today. Obviously, there will be issues. For example, if one of the problems is inconsistent reporting or a lack thereof, that is important. But, do they like the interface? How does it handle billing and shipping? An upgrade may fix the lack of reporting, but if the interface is bad, an upgrade may not be the ideal fix.

  2. How is your relationship with the ERP provider?

When it comes to your ERP provider, the choice to upgrade or find a new one can be simplified. Do they still exist? Have you been using them for service? Do they nickel-and-dime you for every little thing? This relationship could be the most critical factor in making a final decision.

  3. How is your hardware/software infrastructure setup?

This one is simple. Do you have a server and other infrastructure in place to support a new system? Or, if you prefer a cloud environment, do you have the capabilities to manage it yourself? If not, and getting up to par is not in the budget or a priority, you are going to have to stick with your current provider.

  4. Does the upgrade offer what a new system would?

When your current system offers an upgrade, does it look like a brand-new system or like minor changes to what you currently have? Do you feel like you need a big change, or are minor changes okay? Based on your answers to these questions, you should get a sense of the right decision for your company.

  5. Is it going to cost as much as a new system or far less?

How does the price to upgrade compare to the price of a new system? If it is more or the same, I would strongly advise that you at least look at a few other vendors. If it is far less, then see whether it can meet your needs. If it can, the decision seems cut and dried from there.


Workwise LLC

© 2021 Business Intelligence Info