Tag Archives: files

Search SQL Server error log files

January 23, 2021   BI News and Info

Each instance of SQL Server logs information about its processing to a file known as the error log. Depending on how long an instance has been up and what is being logged, the log files might be small or large. When the log files are small, they are fairly easy to browse using SQL Server Management Studio (SSMS). But when they are large, it is cumbersome to browse through them in SSMS to find individual error log messages. Sometimes the error log file is so large it can't even be opened in SSMS. This article will show you a few different ways to browse and search SQL Server error log files.

Using SSMS to search and filter large SQL Server error log files

When browsing a large error log file with SSMS, it can take a long time just to scroll through the file to find the portion of the log that you might be interested in reviewing. I find it easier to use the search and filter options, so I'll demonstrate how to use these options to find information in large error log files.

Using the search option

The search option is useful for finding the next occurrence of a string of characters in the log. To search, you can just browse through one of the archived log files, as shown in Figure 1.


Figure 1: Browsing my error log file

Figure 1 shows the beginning of the error log file, and the entries are sorted by the date/time from the oldest to the newest. You can see the Search function outlined by a red box at the top of the screenshot. To use the search function, just click on this search icon, which brings up the search dialog shown in Figure 2.


Figure 2: Search selection dialog

To search, just enter the string of characters you want to find in the Search for: field. The search can be case-insensitive or case-sensitive based on whether the Match case check box is checked. You can also search just the Message column or all the columns, depending on whether the Search Message column only box is checked. When an error log file spans many days, you could uncheck this checkbox to search for a particular date/time string in the log. By doing this, the view can be repositioned to display a specific day in a log file that contains multiple days.

For this demonstration, enter the string error in the Search for: criteria. Once the search criteria are filled in, the Search button is enabled, as shown in Figure 3.


Figure 3: Enabling Search Button

When you click the Search button, the error log view is repositioned to the first occurrence of the string error, as shown in Figure 4.


Figure 4: Repositioned to first occurrence of the string

Click the Search button again to move to the next message text that contains the string error, as shown in Figure 5.


Figure 5: Next occurrence of the string “error”

By reviewing Figure 5, you can see the search function found the string error just a few lines further down in the log (the actual string error is located out of view to the right). By clicking the search button repeatedly, you can progressively work through the large error log file, finding all the messages that contain the string error. Once the last message is found, the search will start over from the top if you click the button again.

Using the search button repeatedly could be a little tedious, especially if the log file contains many messages with the string error. Another way to find all the messages without clicking and scrolling is to use the filter option.

Using the Filter Option

The filter option makes it a little easier to find all the occurrences of a string in the error log file. It does this by sifting through a large error log file and only displaying those rows that meet the filter criteria. Filtering is handy when you want to view specific log entries in a very large log file. To bring up the filter criteria, you need to click on the Filter option in the Log File Viewer window, as shown in Figure 6.


Figure 6: Selecting the Filter Option

When the filter option is clicked, the dialog box in Figure 7 is shown.


Figure 7: Filter Options

As you can see from Figure 7, there are several different filter selection options from which to choose. You can use one or more of these filter options to identify the error log records you want to display. Table 1 lists the descriptions for each of these filter options.

Table 1: Descriptions for each filter option

  • User: The user name associated with the log entry
  • Computer: The computer associated with the log entry
  • Start Date: The log entry must have been created on or after this date
  • End Date: The log entry must have been created on or before this date
  • Message contains text: The log entry message must contain this text (case-insensitive)
  • Source: The source of the log entry
  • Instance Name: The instance name associated with the log entry
  • Event: The event ID associated with the Windows log entry

To demonstrate how to use the filter dialog to find specific error logs, first try to find the ERRORLOG file directory name using the Message contains text filter item. The error log directory name is displayed on an error log line item that contains the string Logging SQL Server messages in the message text. Therefore, all you need to do is enter this string in the Message contains text filter item, check the Apply filter checkbox, and then click on the OK button, as shown in Figure 8.


Figure 8: Applying Filter

After clicking the OK button, only the error log lines that contain the text are displayed, as shown in Figure 9. If the Apply filter checkbox is not checked before clicking the OK button, the filter won't be applied.


Figure 9: Results of message text filter

Using the filter item is especially useful for finding those messages that are hidden amongst all the messages you are not interested in. I also find using the Start Date and End Date filters extremely useful to find log entries for a specific date range. The date range filter is handy when the error log file is very large and contains multiple days of error log records.

Out of memory errors when viewing large logs

If SQL Server has been up for a while and the error log has not been cycled, or a lot of messages have been written to the log file over a short time, then the error log might be very large — possibly in the gigabyte size range. If you try to open one of these gigabyte log files using SSMS, a memory exception will occur. Figure 10 shows the out of memory exception that can occur when opening one of the large error log files.

word image 64 Search SQL Server error log files

Figure 10: Out of memory exception when trying to view a large error log file

I got this error when I tried to open one of my large, archived log files that was over 8 GB in size. When this error occurred, some of my log records were loaded into the viewer. I could still use the search option, but I got another memory exception when I tried to use the filter option.

If you are trying to use SSMS to view large log files and are having memory issues, this doesn't mean you are out of luck. There are other options to view, search, and filter these large log files.

Using a text editor to view a large log file

One option to view a large log file is to use a text editor. But it can't just be any text editor; it needs to be a text editor that can read a large file. I have downloaded and used UltraEdit in the past to open large error log files. I'm not endorsing UltraEdit; I only mention it here because it is one of the editors I have used in the past to look at large log files. Keep in mind that UltraEdit is not free software; you need a license to use this product long-term. Before you consider downloading any text editor off the internet, make sure you understand the usage and license requirements of the software being downloaded.

Programmatically searching the error log file

Another option for searching those larger log files is to do it programmatically. SQL Server provides an undocumented extended stored procedure named xp_readerrorlog that can be used to search the error log and the SQL Agent log files.

Listing 1 is an example of how I used this undocumented stored procedure to search the active error log file on one of my instances of SQL Server.

Listing 1: Using xp_readerrorlog to find the location of error log file

exec xp_readerrorlog 0, 1, N'Logging SQL Server messages in file';

This example searches for the string Logging SQL Server messages in file in the active log file. The output shown in Figure 11 is returned when running the command.


Figure 11: Output from running code in Listing 1

Searching for this particular string in the active log file finds the log record that identifies the file location where the error log messages are being written.

Even though this stored procedure is undocumented, there are many resources out there that explain how to use it. This stored procedure supports seven parameters. Those parameters are described in Table 2.

Table 2: Parameters for xp_readerrorlog

  • Parameter 1: Identifies the error log file that you would like to read. Set this parameter to 0 to read the current error log, or to 1, 2, 3, and so on to read one of the historical error log files.
  • Parameter 2: Identifies which log to read: 1 or null for the ERRORLOG, or 2 for the SQL Agent log.
  • Parameter 3: The first string you want to search for in the error log file.
  • Parameter 4: The second string you want to search for in the error log file.
  • Parameter 5: The start time constraint on searching.
  • Parameter 6: The end time constraint on searching.
  • Parameter 7: The sort order of the output (ascending or descending).
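
To make the parameters concrete, here is a minimal sketch of a call that uses all seven. Because the procedure is undocumented, the N'asc' sort-order literal and the exact behavior of the two search strings are assumptions based on how this procedure is commonly described, so verify them on your own instance:

-- Search the current error log (0) of SQL Server (1) for messages
-- that contain both 'error' and 'database', within a date range,
-- sorted in ascending order.
DECLARE @Start datetime = '2021-01-01';
DECLARE @End   datetime = '2021-01-23';
EXEC master.dbo.xp_readerrorlog
     0,           -- log file: 0 = current, 1..n = archived logs
     1,           -- log type: 1 = ERRORLOG, 2 = SQL Agent log
     N'error',    -- first search string
     N'database', -- second search string (combined with the first)
     @Start,      -- start time constraint
     @End,        -- end time constraint
     N'asc';      -- sort order: N'asc' or N'desc'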

Finding all the records in a large log file that contained the word error can easily be done by just changing the search string in parameter 3 of the code in Listing 1. You can write a short T-SQL script to find all the log records from the active SQL Server log file for yesterday and then place them in a temporary table for further analysis using the code in Listing 2.


-- Declare variables needed
DECLARE @StartDate date,
        @EndDate   date;

-- Create a temporary table to hold the error log records
CREATE TABLE #ErrorLogForYesterday (
  LogDate datetime,
  ProcessInfo varchar(max),
  Text varchar(max));

SET @StartDate = dateadd(dd, -1, getdate()); -- Yesterday's date
SET @EndDate   = getdate();                  -- Today's date

-- Extract error log records for yesterday into the temporary table
INSERT INTO #ErrorLogForYesterday EXEC xp_readerrorlog
            0, 1, N'', N'', @StartDate, @EndDate;

-- Display the error log records extracted
SELECT * FROM #ErrorLogForYesterday;

-- Cleanup
DROP TABLE #ErrorLogForYesterday;

Listing 2: Code to extract yesterday’s error log records

Programmatically finding error log records makes it easy to build processes to analyze the error log file. Using the method in Listing 2, a DBA could create a series of scripts that could programmatically run the xp_readerrorlog stored procedure to quickly analyze the different error log files.
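
As a starting point, a minimal sketch of such a script might pair xp_readerrorlog with xp_enumerrorlogs, another undocumented procedure that lists the available log files. The column layout of the xp_enumerrorlogs result set is an assumption here, so check it before relying on this:

-- Capture the list of error log files (archive number, date, size)
CREATE TABLE #LogFiles (
  ArchiveNumber int,
  LogDate varchar(50),
  LogFileSize bigint);
INSERT INTO #LogFiles EXEC master.dbo.xp_enumerrorlogs;

-- Search every log file, current and archived, for the string 'error'
DECLARE @LogNumber int;
DECLARE LogCursor CURSOR FOR SELECT ArchiveNumber FROM #LogFiles;
OPEN LogCursor;
FETCH NEXT FROM LogCursor INTO @LogNumber;
WHILE @@FETCH_STATUS = 0
BEGIN
  EXEC master.dbo.xp_readerrorlog @LogNumber, 1, N'error';
  FETCH NEXT FROM LogCursor INTO @LogNumber;
END;
CLOSE LogCursor;
DEALLOCATE LogCursor;
DROP TABLE #LogFiles;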

Reading and Searching SQL Server Error Log Files

When SQL Server creates large error log files, it presents challenges for DBAs to read them. Large log files are cumbersome to scroll through to find errors. Luckily, the log viewer functionality of SSMS has the Filter and Search features built in to allow a DBA to find strings within these large log files quickly. Additionally, using T-SQL code to call the undocumented xp_readerrorlog stored procedure allows a DBA to build scripts to read those large log files. Using these different methods to find errors in large SQL Server log files is critical for managing and maintaining SQL Server.

If you like this article, you might also like SQL Server Error Log Configuration – Simple Talk (red-gate.com)


SQL – Simple Talk


Dynamics 365 CE On Premises: Data Files – CRM database autogrowth

January 6, 2021   Microsoft Dynamics CRM


A Dynamics 365 CE database usually has one data file. If you didn't change the default configuration, it is very probable that you have autogrowth set to 1 MB and unlimited. If so, this number is very low: every time the file reaches its current size and needs to grow, SQL Server takes a little time to add another 1 MB, which means this will occur very often. On the other hand, let's suppose you have autogrowth set to 10,240 MB (10 GB) and unlimited. This number is very large, and when the file needs to grow, SQL Server can freeze for a while as it takes time to add another 10 GB.

Let’s take a look at the Autogrowth configuration:


According to the picture above, we can have the following configuration set:

Autogrowth:

  • Enabled
    • File Growth
      • in Percent
      • in Megabytes
    • Maximum File Size
      • Limited to (MB)
      • Unlimited
  • Disabled

If you enable autogrowth, you need to specify the metrics. You can set the file to grow in percent or in megabytes. Also, you can set it to grow until a limited value in MB or leave it unlimited (meaning the file can grow until the total disk space capacity is reached). Let's suppose you set your file growth in percent. In this case it is pretty common to keep it increasing by 10 percent. That can be very quick if the size of your data file is up to 1 GB (it would increase by ~100 MB). But if your data file is 1 TB, 10 percent will increase it by 100 GB, and your SQL Server can be very busy until the file growth finishes.


Consider changing the configuration of the SQL Server data files for MSCRM. One option may be to increase the maximum allocation size manually and disable automatic growth, to prevent SQL Server from frequently taking time to grow by 10 GB. Obviously, this requires monitoring to avoid reaching the maximum disk allocation reserved for data files.
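
For reference, a minimal T-SQL sketch of that change is below; the database name Contoso_MSCRM and the logical file name mscrm are placeholders, so substitute the values from your own organization database:

-- Pre-size the data file manually (SIZE must be >= the current size)
ALTER DATABASE [Contoso_MSCRM]
MODIFY FILE (NAME = N'mscrm', SIZE = 512000MB);  -- example size: 500 GB

-- Disable automatic growth for the file
ALTER DATABASE [Contoso_MSCRM]
MODIFY FILE (NAME = N'mscrm', FILEGROWTH = 0);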

Walter Carlin – MBA, MCSE, MCSA, MCT, MCTS, MCPS, MBSS, MCITP, MS

Senior Customer Engineer – Dynamics 365 – Microsoft – Brazil



Dynamics 365 Customer Engagement in the Field


Converting xls files to xlsx file using #powerautomate and avoid the pitfalls in #powerquery using xls

June 16, 2020   Self-Service BI

As described in this post – https://www.ehansalytics.com/blog/2020/2/15/avoid-using-excel-xls-files-as-data-source – there are issues to be aware of when you use xls files instead of xlsx in Power Query. See also this thread: https://social.technet.microsoft.com/Forums/en-US/41f2c8ec-1f2c-4591-ac6a-54764b2a90a7/bug-in-excelworkbookwebcontents-powerquery?forum=powerquery.

Answering a Twitter thread started by Imke Feldmann (https://twitter.com/TheBIccountant), with replies from Ruth Pozuelo (go follow her excellent YouTube channel), encouraged me to write this post, as I claimed we can convert the xls files to xlsx using Power Automate.


So here is a guide on how to do it

Convert xls files to xlsx

In this scenario I will use a trigger that fires when an e-mail is received, together with a REST API provided by https://cloudconvert.com/.

Note – this is a paid service where you pay by the minute for the time the conversion takes – prices from $0.02 to $0.01 per minute.

First we start by selecting to build an automated flow and select the trigger “When a new email arrives (V3)”


Set the advanced options so the flow only triggers when attachments are included, and so the attachments are available to the following steps in our flow.


As the next step, I use Azure Blob storage to store the files from the e-mail. When selecting the output from the previous step, Power Automate will automatically create an Apply to each container in which we can refer to each attachment in the mail.

In the Create Blob Action I connect to a blob storage and load the attachment into a preload folder


Now add a step where we create an HTTP request within the Apply to each container


In order to use the cloudconvert REST API, we first need to create a Process and use the process ID to create the conversion – documentation here


In the body property you specify your own API key and tell the process that what you want to do is a conversion from xls format to xlsx.
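
A minimal sketch of that request body, assuming the cloudconvert v1 process API (the field names are taken from that API rather than from this flow, so verify them against the documentation):

{
  "apikey": "<YOURAPIKEY>",
  "inputformat": "xls",
  "outputformat": "xlsx"
}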

Next – add another HTTP request

We use a POST request again


And in the URI we use the result from the previous step, as it returns a unique address for the process ID from cloudconvert.


In the Body property we specify where the input file comes from and where the converted output should be stored:

{
  "input": {
    "azureblob": {
      "storageaccount": "<NAMEOFBLOBACCOUNT>",
      "storageaccesskey": "<AccountKey>",
      "container": "preload"
    }
  },
  "file": @{body('Create_blob')?['Name']},
  "timeout": 10,
  "output": {
    "azureblob": {
      "storageaccount": "<NAMEOFBLOBACCOUNT>",
      "storageaccesskey": "<AccountKey>",
      "container": "converted"
    }
  },
  "outputformat": "xlsx"
}

Here is the final overview of the steps needed.


Now send an e-mail to the inbox your trigger is connected to in order to make the flow run.


And in our preload folder we can see the files


And in the converted folder we have the converted xlsx files


Hope this can help you convert your xls files to xlsx.

This will also make it much easier if you want to be able to setup

Power On!


Erik Svensen – Blog about Power BI, Power Apps, Power Query


Facial recognition startup Megvii files IPO in Hong Kong

August 27, 2019   Big Data

(Reuters) — Chinese AI firm Megvii Technology, backed by Alibaba, has filed in Hong Kong to conduct an IPO targeting proceeds of at least $500 million, two people said, just as the city faces political unrest and its first recession in a decade.

Beijing-based Megvii, widely known for facial recognition platform Face++, may raise as much as $1 billion in the initial public offering, said one of the people, who expects the share sale in the fourth quarter of the year.

The filing comes as companies postpone or slow down listing plans in a recession-bound city blighted with nearly three months of anti-government protests, and where the benchmark Hang Seng share price index fell to seven-month lows this month.

Reuters reported last week that China's biggest ecommerce firm, Alibaba Group, had delayed its up to $15 billion Hong Kong listing.

Megvii has decided to press ahead with its IPO plans because it has little business in Hong Kong and expects the unrest to ease later this year, said a third person.

Megvii declined to comment. The people who had direct knowledge of the matter declined to be identified as the information was not public yet.

AI leader

Megvii, founded in 2011 by CEO Yin Qi and two friends from Tsinghua University, would become the first Chinese artificial intelligence firm to go public in Hong Kong.

Its filing comes amid government plans for China to become an international leader in AI, a technology that is becoming increasingly central in various sectors.

Once the preserve of researchers, AI has grabbed the attention of businesses as varied as healthcare and financial services looking to use algorithms to comb through troves of data to recognize patterns and solve problems.

In May, Megvii raised $750 million from investors including Bank of China Group Investment and Australia's Macquarie Group at a valuation of slightly over $4 billion.

The company, also backed by Ant Financial, provides facial recognition and other AI technology to governments and companies including Alibaba, Ant Financial, Lenovo Group, and Huawei Technologies.

It booked a loss of 3.35 billion yuan ($472 million) on revenue of 1.43 billion yuan last year, widening the loss from 759 million yuan a year earlier. Its adjusted operating profit, which excludes one-off items such as share-based compensation payments, reached 75.7 million yuan last year, its draft prospectus showed.

It will use IPO proceeds primarily for research and development, marketing and sales, plus global expansion and strategic investment opportunities, the prospectus showed.

Citigroup, Goldman Sachs and JPMorgan are joint sponsors of the IPO.



Big Data – VentureBeat


Capacity raises $13.2 million to index emails, files, and more with AI

August 21, 2019   Big Data

Capacity (formerly Jane.ai), a startup developing a platform that indexes data from apps, teams, and more and enables users to search through the corpus using natural language, today revealed that it recently raised $13.2 million in series B funding from undisclosed Midwest angel and private investors. The influx of capital brings the company's total raised to over $20 million, following an $8.4 million series A round in June 2018.

“We created [Capacity] to help everyday workers be more successful by eliminating the wasted time and effort that comes from searching for basic workplace information,” said Capacity cofounder and CEO David Karandish, formerly the CEO of Answers.com. “We’ve all grown accustomed to the convenience of on-demand, personalized services and voice-controlled speakers at home, but we have yet to benefit from these same conveniences at work. [Capacity] is an intuitive, intelligent AI-powered Teammate who gives employees instant access to the information they need to do their jobs well.”

Capacity is a service in two parts. Its cloud backend mines information from documents; webpages; email and calendar apps like Gmail and Exchange; customer relationship management (CRM) software like Salesforce and Oracle’s NetSuite; health information and resource services (HIRS) like ADP and Sage; service desk platforms like Zendesk and ServiceNow; and cloud drive providers like Box and OneDrive. The second part is a chatbot with natural language processing capabilities that integrates with popular messaging apps such as Slack and Skype. With Capacity, users can type things like “I need the Centene contract from August 2017” and “How much PTO do I have?” or even instruct it to schedule appointments (“Schedule 15 minutes to meet with David and Josh”) and update the status of sales leads (“Update the status of the Express Scripts deal to ‘won’”).


Above: Capacity in action.

Image Credit: Capacity

It’s customizable, too. Capacity can deliver company-wide announcements, like daily news and event notifications, and onboard new hires by providing access to forms that need to be completed. For customers with websites that have FAQ sections, it can be made public-facing to help cut down on customer service requests.

Arguably the real value of Capacity is its ability to improve over time, according to Karandish, thanks to its CoPilot and Expert Finder features. When the chatbot doesn’t know the answer to something, it flags that item for a team member to review and stores it in a database. Additionally, if the answer is a bit more complex than can be communicated in a few sentences, Capacity creates a detailed conversational workflow informed by expert human knowledge.

Capacity was founded in St. Louis, Missouri in 2017 by Karandish and cofounder Chris Sims. Customers include Washington University in St. Louis, electric utility company Ameren Corporation, USA Mortgage, Kelly Mitchel, Newell Brands, West Community Credit Union, Total Access Urgent Care, Maryville University, Framecad, EXL, and Schaeffer's Oil.



Big Data – VentureBeat


Pinterest files for IPO

March 23, 2019   Big Data

(Reuters) — Pinterest, the owner of the image search website known for the food and fashion photos that its users post, filed for an initial public offering with U.S. regulators on Friday, looking to tap into a red-hot market for new stock offerings.

The filing comes a day after jeans maker Levi’s blockbuster debut, and ride-hailing service providers Lyft and Uber are set to pursue much-anticipated listings. Investors are anticipating 2019 may be one of the most active years ever for tech IPOs.

Pinterest, which plans to list under the symbol “PINS” on the New York Stock Exchange, set a placeholder amount of $100 million to indicate the size of the IPO. The final size will change.

Reuters reported in January that Pinterest could raise around $1.5 billion and that the IPO was likely to come in the first six months of 2019.

The company was valued at $12 billion in its last fundraising round in 2017.

The San Francisco-based company has grown rapidly since its founding in 2008, boasting in the regulatory filing that it reaches more than 250 million monthly active users, two thirds of whom are female.

Pinterest said its annual revenue in 2018 was $755.9 million, up 60 percent compared to 2017. Nevertheless, it remains unprofitable, with a net loss of $62.97 million, narrowing from a net loss of $130 million a year earlier.

Like Lyft, Pinterest plans to go public with a dual-class share structure to concentrate voting power with Class B shareholders, which include co-Founder, President and Chief Executive Officer Benjamin Silbermann.

However, Pinterest said Class B shares will automatically convert into common shares seven years after the IPO. This conversion will not take effect if these Class B stockholders continue to own at least 50 percent of their shares held at the time of the IPO.

This follows a trade group representing top U.S. pension funds and asset managers asking exchanges to require companies seeking to go public with share classes with unequal voting rights to have plans to equalize them within seven years.

Investors focused on corporate governance have criticized dual-class share structures since the likes of Snapchat parent Snap Inc and meal-kit maker Blue Apron Holdings Inc went public with little or no voting representation for certain investors.

Goldman Sachs and JP Morgan are the lead underwriters on the Pinterest IPO.


Big Data – VentureBeat


Big Files, Bigger Help

March 12, 2019   BI News and Info

I made a large program and it's pretty clunky; I am not great at reducing my program's line count. I was wondering if there is a way to ask for help on big notebooks that are above the word limit for questions. I am working on simplifying the program and I would like help doing this. I have questions on the notebook itself, but I am still getting used to the support here. Please let me know what I should do instead of posting it to Dropbox.
Thanx


Recent Questions – Mathematica Stack Exchange


How to Reference Azure Storage Files from Cloud Shell

December 10, 2018   Self-Service BI


Blog – SQL Chick


Using Customization Files for Deploying Custom Code

July 26, 2018   Microsoft Dynamics CRM

When using USD in a CRM implementation there is a lot we can accomplish using configuration and out-of-the-box hosted controls, but in implementations that are more complex, we often find ourselves utilizing custom hosted controls to get the job done.

Often these custom hosted controls consist simply of ".dll" libraries, but sometimes we may need other files or dependencies. In the past, we would have to deploy these files using some distributed installer or script to copy the dependencies into each user's USD install folder, but no longer!


There’s gotta be an easier way!

Now with Customization File records we can easily distribute and update custom controls and functionality to our agent’s machines.

To create one, let’s navigate to Settings > Unified Service Desk > Customization Files and create a new record.


Customization Files, like any other USD configuration, are records in Dynamics 365 and should be associated with a USD Configuration record. We can track the version number using the version field; as we increment this number, the cached controls will be overwritten on the user's machine.

Next, let’s prepare the attachment which will contain our files to distribute. This is a zip file consisting of the files we want to distribute and a “Content Types” file describing what file types should be extracted from the zip.

So, gather your files and create a new file alongside them named "[Content_Types].xml", which should contain XML like the following:

<?xml version="1.0" encoding="utf-8"?>
<Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">
  <Default Extension="dll" ContentType="application/octet-stream" />
  <Default Extension="config" ContentType="application/octet-stream" />
  <Default Extension="css" ContentType="application/octet-stream" />
</Types>

Each child of the “Types” element will describe a file type which should be extracted. The above Content Types XML will cause any files ending in .dll, .config, or .css to be extracted. After creating this file, place it in the same folder as the files that you wish to distribute and then zip them up with your favorite archive tool.
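
For illustration, the finished archive might be laid out like this (the control file names are hypothetical):

CustomControls.zip
  [Content_Types].xml
  MyHostedControl.dll
  MyHostedControl.dll.config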


Finally, attach this zip file to the previously created Customization File.


Now when our users start USD, it will pull down, extract, and cache these files in their Local AppData folder at the following location: C:/Users/<UserName>/AppData/Local/Microsoft/UnifiedServiceDes/<OrgUniqueName>/<GUID>/


That’s all there is to it!

For more helpful Dynamics 365 tips and tricks, subscribe to our blog!

Happy Dynamics 365’ing!


PowerObjects- Bringing Focus to Dynamics CRM


#powerquery – How to handle different decimal separator when importing csv files

July 4, 2018   Self-Service BI

Recently I have been working on a project where the solution should import a csv file exported from a SQL server. For some reason sometimes the data comes with a , (comma) as the decimal separator and other times with . (dot) as the decimal separator.

This meant that when importing the different files, I had to find a way to dynamically change the culture code setting when importing the file.


Let’s try and open the SalesWithDot.csv file in Power BI Desktop


As my Power BI Desktop uses Danish settings, and Danes use , as the decimal separator, Power BI Desktop will think the value is 30.000 instead of 300, etc. We have to tell Power Query that the source is from another culture, so I click Edit.


As we can see from the documentation for Table.TransformColumnTypes, the function has an optional parameter called Culture.


And by adding “en-US” to our formula bar we can have the right value in the SalesValue column


But when we import a file where the sales value is formatted with a , as the decimal separator and we use the same culture value ("en-US"), then we have a problem.


And by changing the culture to da-DK it shows the right values


So how can we make Power Query dynamically determine which locale to use?

In this case I know that the column SalesValue will either contain a comma or a dot – and by checking the first value of the file imported I can select which culture to use – this can be done like this

The step before "Changed Type" is called "Promoted Headers".


To check whether the Sales Value contains a comma – we can use the Text.Contains function


And we can refer to the first value in the SalesValue column like this

#"Promoted Headers"[SalesValue]{0}

Then

Text.Contains(#"Promoted Headers"[SalesValue]{0}, ",")

will give us true if the cell contains a comma before it is changed to a decimal value.

When we know this we can change the

= Table.TransformColumnTypes(#"Promoted Headers",{{"Product", type text}, {"SalesValue", type number}, {"SalesUnits", Int64.Type}}, "da-DK")

To

= Table.TransformColumnTypes(#"Promoted Headers",{{"Product", type text}, {"SalesValue", type number}, {"SalesUnits", Int64.Type}}, if Text.Contains(#"Promoted Headers"[SalesValue]{0}, ",") then "da-DK" else "en-US")

The if statement will then use the da-DK culture if the SalesValue contains a comma.
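
Putting the pieces together, the whole query might look like this minimal sketch; the file path, delimiter, and encoding in the Csv.Document call are assumptions, while the column names come from the example above:

let
    Source = Csv.Document(File.Contents("C:\Data\Sales.csv"),
        [Delimiter = ";", Encoding = 65001]),
    #"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Pick the culture by checking the first SalesValue for a comma
    Culture = if Text.Contains(#"Promoted Headers"[SalesValue]{0}, ",")
        then "da-DK" else "en-US",
    #"Changed Type" = Table.TransformColumnTypes(
        #"Promoted Headers",
        {{"Product", type text}, {"SalesValue", type number}, {"SalesUnits", Int64.Type}},
        Culture)
in
    #"Changed Type"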


And in the example file with the dot as the decimal separator, the values are also shown correctly.


You can download an example file here.

Hope you find this useful – Power On!


Erik Svensen – Blog about Power BI, Power Apps, Power Query
