Tag Archives: Part

FSI Blog Series, Part IV: Staying Agile in Trying Times

January 22, 2021   Microsoft Dynamics CRM

Welcome to the final installment of our four-part blog series exploring the partnership that exists between HCL-PowerObjects, Microsoft, and Seismic – the world’s most powerful storytelling platform. As we’ve said many times now, our three businesses together provide a unique set of partnership benefits for our customers that includes execution deployment, data analytics, and customized content.

Source

PowerObjects- Bringing Focus to Dynamics CRM

Read More

FSI Blog Series, Part III: Personalization

January 14, 2021   Microsoft Dynamics CRM

Welcome to Part III of our blog series exploring the partnership that exists between HCL-PowerObjects, Microsoft, and Seismic – the world’s most powerful storytelling platform. Our three businesses together provide a unique set of partnership benefits for our customers that includes execution deployment, data analytics, and customized content. This series focuses on the power of this triumvirate…

Source

PowerObjects- Bringing Focus to Dynamics CRM

Read More

Managing and Marketing Microsoft Teams Live Events – Part 2

January 13, 2021   CRM News and Info

In part one of our two-part blog series on Microsoft Teams, we explained what Microsoft Teams Live Events are and showed how simple it is to create and market one. There are three simple steps to follow when holding a Microsoft Teams Live Event: planning and setup, marketing, and post-event wrap-up. In this second part, we will cover marketing your Teams Live Event, as well as the post-event wrap-up.

Step three: Marketing your Microsoft Teams Live Event

This may be one of the most important steps in holding your live event. Ask yourself who you want to attend your webinar. Do you want primarily new customers, existing customers, or a mix? If you are marketing primarily to existing customers, you may want to do something like add a note to your email signature or send emails to existing customers and prospects. When engaging new prospects, you may want to create a blog post about your topic and share it on social media. Don’t forget #hashtags! You may also want to consider joining and posting in LinkedIn groups related to your topic and creating LinkedIn Lead Generation forms to capture sign-ups from LinkedIn.

Our marketing automation tool of choice is ClickDimensions. ClickDimensions was made exclusively for, and is natively built in, Microsoft Dynamics 365. Using ClickDimensions, we built a campaign automation when planning the marketing of our webinar. Campaign automations let you set up dynamic paths for a contact or lead based on certain actions. In our fast-paced world, they simplify daily processes by turning manual steps into automated ones. Campaign automations are not only simple to set up but extremely beneficial: they give you peace of mind and increase productivity, freeing you up to focus on other things. Below is an example of our campaign automation.

[Screenshot: an example campaign automation in ClickDimensions]

Step four: Post event wrap up of your Microsoft Teams Live Event

After your Microsoft Teams Live Event is over, you still have a few tasks to perform. You can make your job easier by prebuilding many of these assets and finishing them after the webinar is completed. We like to upload all of our webinar videos to YouTube. We also save our PowerPoint presentation as a PDF and upload both the PDF and the video to our website. You will also want to send an email to everyone who registered (not just attendees), thanking them for their interest and including links to the webinar recording on YouTube and the PowerPoint notes.

Don’t take down the webinar landing page you built for registrations. Use it to continue generating leads. Update the sign up form so that people can watch the video on demand.

In Conclusion

Webinars are among the best marketing platforms. They establish you as an authority in the field and give you direct contact with your target group. Holding webinars actually saves time and money, as they give you a global audience without the need to travel and let you reach a large number of people at once. Remember, the hardest thing about doing webinars is just getting started. Decide you want to do it and just begin. You won’t be perfect at first; you will evolve with time and practice.

Get started today – happy planning!

How Can We Help?

If you have any questions on getting started with Microsoft Teams Live events or if we can help you with any of your Dynamics 365 needs, don’t hesitate to contact us. We would be more than happy to assist you. enCloud9 is a recognized expert in helping organizations like yours get the most out of their Dynamics 365 system.

Watch for details of our upcoming webinars in 2021. Topics to be announced soon and registration will be available on our website, as well as on our Facebook page.

Visit our YouTube channel for other informative webinars, or our blog for the most current information.

CRM Software Blog | Dynamics 365

Read More

Managing and Marketing Teams Live Events – Part 1

January 9, 2021   CRM News and Info

What are Microsoft Teams Live Events?

So, you’ve likely used Microsoft Teams for chat, messages, and meetings, but did you know that you can also host webinars in Teams? Microsoft Teams Live Events are an extension of Teams meetings, enabling users to broadcast video and meeting content to a large online audience. Teams Live Events are meant for one-to-many communications where the host of the event leads the interactions and audience participation is primarily viewing the content shared by the host. It is simple to create and market a Teams Live Event. There are four simple steps to follow when holding one: brainstorming, marketing, preparation/performance, and post-event wrap-up. This blog is part one of a two-part series and covers what Teams Live Events are, as well as planning and setting up your event. Part 2 will cover marketing your Teams Live Event, as well as the post-event wrap-up.

With so many options for webinar hosting services, why would someone choose Microsoft Teams as their platform for hosting webinars?

There are many reasons someone would choose to host webinars in Microsoft Teams using Live Events.

  • Cost – This is a very significant factor in choosing a webinar hosting service. If you are already using Teams, why pay for another service when you already have the functionality? Getting started with Teams Live Events is not difficult. Initially, you may have more work to set things up, but once it is running, the benefits simply outweigh any additional work you may have to do.
  • More Control – This is where Microsoft Teams Live Events shine. When it comes to moderating a webinar, it is vital to be in full control. One way Live Events gives you more control is the recording of the webinar: YOU choose when the recording begins and ends, something that is not an option in many other webinar software tools. To start the recording in a Teams Live Event, simply click the ellipsis (…) in the meeting panel and select Start Recording. When ending the call, it is suggested to end it for all attendees so that nobody lingers. You also have control over what attendees see: you can show your whole desktop or just a specific window, and you can turn your camera on or leave it off. Finally, you control the roles of the participants. There are three roles – producer, presenter, and attendee.
  • Easy Editing – Once you have your Live Event set up, you can edit the details in Meeting Options. Here you control such things as who can enter the call directly, who waits in the lobby until admitted, and who can present content within the meeting.
  • Attendee access – Attendees will have access to the recording, as well as your PowerPoint presentation, if you choose. They can easily share your content or refer to it at a later time.

Step one: Planning and Set up of your Microsoft Teams Live Event

When planning a webinar, you must first make an outline. Consider the amount of time you have and what you want to cover in that time. The basic format we follow for our webinars is as follows:

  • Introduction (first five minutes or so) – during this time, we introduce who we are, state the topic of our webinar, let people know that we will be recording, and cover any rules for the webinar (e.g., questions at the end).
  • Demo (the majority of the time) – walk through your topic in a meaningful way to engage your viewers. Everyone is busy, and we want people to feel that attending our webinar was a good use of their time. Make sure not to focus too much on the technical side of things – focus on the benefits!
  • Wrap up – now is the time to review questions from your audience. (You may want to have a couple of questions ready to go, just in case!) Let attendees know that they will be getting a follow-up email after the webinar.

The beginning stage of planning your webinar is brainstorming. You must first decide how often you want to do webinars. Will your webinar be a one-off event or part of a continuing series? Another question to ask yourself is how long your webinar will last. enCloud9 holds a monthly 30-minute webinar. Everyone is busy, and we felt that for our audience, 30 minutes was appropriate.

The next thing you need to decide on is a topic. It is important to consider your audience. Think about what is new in your industry and what your audience wants to hear about. Look at the analytics of past social media posts and blogs. What has generated the most interest? Look at what your competitors are blogging about and doing webinars on. What is the best way to share your expertise? Will you have a co-presenter or will you bring in an external expert?

And of course giving yourself adequate time to prepare is key. Make sure you have enough time to perform all the work required. Practicing and doing a few trial runs is a good idea.

After deciding on a topic, it is now time to create your live event in Microsoft Teams. Simply go to your calendar, click the down arrow next to New meeting in the upper right corner, and choose Live Event.

[Screenshot: choosing Live Event from the New meeting drop-down in the Teams calendar]

It is now time to fill in details for your webinar.

When you scroll down, you will see more options, such as whether you want a Q&A session or an attendee engagement report.

*Important: when you invite people to your event, you have to choose the role you wish them to have. There are three roles: producer, presenter, and attendee. When assigning roles, it is important to understand exactly what each role can and can’t do.

  • Producer – There should only be one producer. The producer is in charge of controlling the live event: they send the event live, finish it, and can share their own content or that of a presenter. Once the live Q&A session has been switched on, presenters can also act as moderators.
  • Presenter – There can be multiple presenters in your live event. Each presenter can share their audio, live video, or screen content, as well as content from other presenters (if you have more than one).
  • Attendee – Depending on how you set your live event up (public or private), attendees can be external or internal. Attendees join the live event using their invitation link and can view it in their web browser or the Teams app.

In Conclusion

Webinars are among the best marketing platforms. They establish you as an authority in the field and give you direct contact with your target group. Holding webinars actually saves time and money, as they give you a global audience without the need to travel and let you reach a large number of people at once. Remember, the hardest thing about doing webinars is just getting started. Decide you want to do it and just begin. You won’t be perfect at first; you will evolve with time and practice.

Get started today – happy planning!

How Can We Help?

If you have any questions on getting started with Teams Live events or if we can help you with any of your Dynamics 365 needs, don’t hesitate to contact us. We would be more than happy to assist you. enCloud9 is a recognized expert in helping organizations like yours get the most out of their Dynamics 365 system.

Join our January 2021 webinar about The New Search Experience in Dynamics 365. Topics for future webinars will be announced soon and registration will be available on our website, as well as on our Facebook page.

Visit our YouTube channel for other informative webinars, or our blog for the most current information.

CRM Software Blog | Dynamics 365

Read More

Heaps in SQL Server: Part 4 PFS contention

January 5, 2021   BI News and Info

The series so far:

  1. Heaps in SQL Server: Part 1 The Basics
  2. Heaps in SQL Server: Part 2 Optimizing Reads
  3. Heaps in SQL Server: Part 3 Nonclustered Indexes
  4. Heaps in SQL Server: Part 4 PFS contention

After looking at the internal structures and the selection of data in heaps in the previous articles, the next articles will describe how DML operations can be optimized on a heap.

Demo set up

I use data from the same demo database for all demos in my articles and conference sessions. You can download the database [CustomerOrders] here.

In this article, I’ll use an additional database, demo_db. Run the following script to create the demo_db database, a heap to test inserts, and a view pointing to [CustomerOrders].

CREATE DATABASE demo_db;
GO

USE demo_db;
GO

CREATE TABLE dbo.Customers
(
    Id     INT          NOT NULL,
    Name   VARCHAR(200) NOT NULL,
    CCode  CHAR(3)      NOT NULL,
    State  VARCHAR(200) NOT NULL,
    ZIP    CHAR(10)     NOT NULL,
    City   VARCHAR(200) NOT NULL,
    Street VARCHAR(200) NOT NULL
);
GO

CREATE VIEW dbo.CustomerAddresses
AS
SELECT C.Id,
       C.Name,
       A.CCode,
       A.State,
       A.ZIP,
       A.City,
       A.Street
FROM CustomerOrders.dbo.Customers AS C
     INNER JOIN CustomerOrders.dbo.CustomerAddresses AS CA
     ON (C.Id = CA.Customer_Id)
     INNER JOIN CustomerOrders.dbo.Addresses AS A
     ON (CA.Address_Id = A.Id)
WHERE CA.IsDefault = 1;
GO

Standard Procedure – INSERT

When data records are entered in a heap, this process consists of several individual steps that are transparent to the applications. Knowing them leaves room for possible optimization of the process.

Update of PFS

If a data row is stored in a heap and there is not enough space available on the data page, a new data page must be created. The data record can only be saved after the new page has been created.

In the first demo, insert one row into the newly created table from the view.

CHECKPOINT;
GO

INSERT INTO dbo.Customers
SELECT *
FROM dbo.CustomerAddresses
WHERE Id = 1;
GO

The above example adds a new record from an existing data source to the new table. Since the table was previously empty, the table structure must first be created. The undocumented function sys.fn_dblog() can be used to determine which tasks Microsoft SQL Server had to perform to insert the record into the table. I used CHECKPOINT to eliminate previous operations from the results below.

SELECT ROW_NUMBER() OVER (ORDER BY [Current LSN]) AS [Step #],
       [Current LSN],
       Operation,
       Context,
       AllocUnitName,
       [Page ID],
       [Slot ID]
FROM sys.fn_dblog(NULL, NULL)
WHERE Context <> N'LCX_NULL'
      AND AllocUnitName IS NOT NULL;
GO

Figure 1: Recording from the Transaction Log

Step(s)  | Operation / Context             | Description
1 and 2  | LOP_MODIFY_ROW / LCX_PFS        | Since data pages are first created for the table, each allocation must be “registered” in the PFS page. A data page and the IAM page are created and registered for the table.
3        | LOP_FORMAT_PAGE / LCX_IAM       | Creation of the IAM page for the table dbo.Customers
4        | LOP_MODIFY_ROW / LCX_IAM        | Registration of the first data page in the IAM page
5 and 6  | LOP_ROOT_CHANGE / LCX_CLUSTERED | Registration of table metadata in the Microsoft SQL Server system tables
7        | LOP_FORMAT_PAGE / LCX_HEAP      | Preparation of the data page of the heap for storing the records
8        | LOP_ROOT_CHANGE / LCX_CLUSTERED | Storage of metadata in the Microsoft SQL Server system tables
9        | LOP_INSERT_ROWS / LCX_HEAP      | Insert of the row into the heap
10       | LOP_SET_FREE_SPACE / LCX_PFS    | Update of the fill level of the data page on the PFS page

Note:  I describe the system pages and their functions in detail in the article “Heaps – The Basics”.

If further records are entered, the existing data page is filled until it is – in percentage terms – so full that no new records can be saved on it and Microsoft SQL Server has to allocate the next data page in the system. Keep in mind that the PFS page tracks the fill level of a page only in coarse buckets (empty, up to 50%, 80%, 95%, or 100% full), not byte by byte.

Run this script to add another 10,000 rows.

CHECKPOINT;
GO

DECLARE @I INT = 2;
WHILE @I <= 10000
BEGIN
    INSERT INTO dbo.Customers
    SELECT *
    FROM dbo.CustomerAddresses
    WHERE Id = @I;

    SET @I += 1;
END
GO

The code above inserts another 10,000 records into the table [dbo].[Customers]. Afterwards, look into the transaction log to see the individual transactional steps.

Figure 2: PFS updates

You can see that Microsoft SQL Server must update the PFS page several times (lines 2, 46, 73, …). This is because the PFS page – only in the case of heaps – needs to be updated every time the next fill-level threshold is reached.
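To see the fill level that the PFS page currently records for the heap’s pages, a query like the following can help. This is a diagnostic sketch of my own, assuming SQL Server 2019 or later for sys.dm_db_page_info; %%physloc%% and sys.fn_PhysLocCracker are undocumented.

SELECT DISTINCT pl.page_id,
                pi.pfs_alloc_percent, -- the fill-level bucket recorded on the PFS page
                pi.free_bytes         -- the actual free bytes on the data page
FROM dbo.Customers AS c
CROSS APPLY sys.fn_PhysLocCracker(%%physloc%%) AS pl
CROSS APPLY sys.dm_db_page_info(DB_ID(), pl.file_id, pl.page_id, N'DETAILED') AS pi;
GO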

Bottleneck PFS

The PFS page “can” become a bottleneck for a heap if many data records are entered into the heap in a very short time. How often the PFS page has to be updated depends mostly on the size of the records being saved.

This procedure does not apply to clustered indexes since data records in an index must ALWAYS be “sorted” into the data volume according to the defined index value. Therefore, the search for a “free” space is not carried out via the PFS page but via the value of the key attribute!

Microsoft SQL Server must explicitly check after each insert process whether the PFS page needs to be updated or not. If the above result is reduced to processes on the PFS page, the process is easy to recognize.

Figure 3: Filtered operations from the log for PFS activity

In total – due to the short data record length – the PFS page had to be updated 14 times in order to enter 10,000 data records in the heap.

At first glance, that may not seem like a lot – after all, 10,000 records were entered. However, it can become problematic for the PFS page as soon as more than one process wants to enter data into the table at the same time. To derive a trend – imprecise due to the limitations of my test system! – I recorded the latches on the PFS page with an Extended Events session and then ran the insert workload (wrapped in a stored procedure) in parallel with different numbers of clients.

CREATE OR ALTER PROC dbo.InsertCustomerData
    @NumOfRecords INT
AS
BEGIN
    WHILE @NumOfRecords > 0
    BEGIN
        INSERT INTO dbo.Customers
        SELECT *
        FROM dbo.CustomerAddresses
        WHERE Id = @NumOfRecords;

        SET @NumOfRecords -= 1;
    END
END
GO

CREATE EVENT SESSION [track pfs contention]
ON SERVER
ADD EVENT sqlserver.latch_suspend_end
(
    ACTION (package0.event_sequence)
    WHERE
    (
        sqlserver.database_name = N'demo_db'
        AND sqlserver.is_system = 0
        AND mode >= 0
        AND mode <= 5
    )
    AND class = 28
    AND
    (
        -- only check for PFS, GAM, SGAM
        page_id = 1
        OR page_id = 2
        OR page_id = 3
        OR package0.divides_by_uint64(page_id, 8088)
        OR package0.divides_by_uint64(page_id, 511232)
    )
)
ADD TARGET package0.event_file
(
    SET filename = N'T:\TraceFiles\PFS_Contention.xel',
        MAX_FILE_SIZE = 1024,
        MAX_ROLLOVER_FILES = 10
)
WITH
(
    MAX_MEMORY = 4096KB,
    EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS,
    MAX_DISPATCH_LATENCY = 5 SECONDS,
    MAX_EVENT_SIZE = 0KB,
    MEMORY_PARTITION_MODE = NONE,
    TRACK_CAUSALITY = OFF,
    STARTUP_STATE = OFF
);
GO
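The article does not show starting the session or driving the workload; a minimal sketch, assuming the session and procedure created above (each test client runs the EXEC concurrently):

ALTER EVENT SESSION [track pfs contention] ON SERVER STATE = START;
GO

-- run this from every concurrent test client
EXEC dbo.InsertCustomerData @NumOfRecords = 10000;
GO

ALTER EVENT SESSION [track pfs contention] ON SERVER STATE = STOP;
GO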

I carried out each series of tests five times to compensate for possible deviations. After each workload, the recordings from the Extended Events session were analysed with the following query:

SELECT CAST(event_data AS XML) AS StatementData
INTO #EventData
FROM sys.fn_xe_file_target_read_file
     ('T:\TraceFiles\PFS*.xel', NULL, NULL, NULL);
GO

SELECT * FROM #EventData;
GO

WITH XE
AS
(
    SELECT StatementData.value('(event/@timestamp)[1]', 'datetime') AS [time],
           StatementData.value('(event/@name)[1]', 'VARCHAR(128)') AS [Event_name],
           StatementData.value('(event/data[@name="mode"]/text)[1]', 'VARCHAR(10)') AS [mode],
           StatementData.value('(event/data[@name="duration"]/value)[1]', 'int') AS [duration],
           StatementData.value('(event/data[@name="page_type_id"]/text)[1]', 'VARCHAR(64)') AS [page_type]
    FROM #EventData
)
SELECT XE.page_type,
       COUNT_BIG(*) AS num_records,
       SUM(XE.duration) AS sum_duration,
       AVG(XE.duration) AS avg_duration
FROM XE
GROUP BY XE.page_type;
GO

Processes            |    1 |     2 |    4 |      8 |     16 |     32 |     64
PFS contention(s)    |    0 |     1 |    1 |      7 |      7 |     16 |     68
avg. duration (µsec) |    0 |     0 |   27 |    305 |    790 |  1,113 |  3,446
Runtime (sec)        | 4.28 |  5.65 | 7.68 |  13.45 |  23.83 |  55.93 | 165.72
avg. (µsec) per row  |  428 | 2,825 |  192 | 16,813 | 16,769 | 17,478 | 25,894

Figure 4: Dependence on processes to contention(s)

The tests I carried out are not representative because external influences were not properly isolated. Nevertheless, one can deduce from the values that the potential for contention on the PFS page escalates with an increasing number of simultaneous processes.

You know the problem from everyday life: the more people who want to use the same resource at the same time (the till in the supermarket), the longer you have to queue. The bottleneck can be rectified by working with multiple files for the filegroup in which the heap is located – as is also common practice with TEMPDB.
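As a minimal sketch of that fix (the file name, path, and sizes below are my assumptions, not taken from the article), an additional file can be added to the PRIMARY filegroup like this; since every database file has its own PFS pages, concurrent inserts then spread their PFS updates across the files.

ALTER DATABASE demo_db
ADD FILE
(
    NAME = N'demo_db_2',
    FILENAME = N'T:\SQLData\demo_db_2.ndf',
    SIZE = 1024MB,
    FILEGROWTH = 256MB
)
TO FILEGROUP [PRIMARY];
GO

Repeat with further files (the test below uses four) to match the number of concurrently inserting processes.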

Figure 5: A separate database file for each core

I performed the same workload with 4 database files in the PRIMARY filegroup and observed the results with the Windows Resource Monitor:

Figure 6: Relatively even distribution of the write load – better throughput

BTW: now is a good time to learn “a few” German words from the screenshot above:

Datei = file
Lesen = read
Schreiben = write

It was to be expected that this would ease the situation. Think of a supermarket where only one till is open at first: as more customers arrive, a queue builds up in front of the till. Once several more tills are opened, the situation relaxes again.

Figure 7: Significant relaxation for the PFS pages

Bottleneck data structure

Anyone working with heaps must take the data structures into account. The biggest difference in the storage of data between an index and a heap is that data in a heap can be stored anywhere, while indexed tables must store the data according to the index attribute’s value. Storing data in a heap can result in several problems:

  • Waste of storage space due to the calculation of the percentage of available storage space on a data page
  • Waste of memory in the buffer pool, since it is not the data itself that is loaded into the buffer pool, but the data pages on which the data is located (made visible with the query after this list)
  • Increased contention on the PFS page if data records are too large and the percentage fill level has to be updated quickly
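The buffer pool waste can be inspected with the documented DMV sys.dm_os_buffer_descriptors; a sketch for the demo table (every cached page occupies 8 KB of RAM, regardless of how full it is):

SELECT COUNT(*) AS cached_pages
FROM sys.dm_os_buffer_descriptors AS bd
     INNER JOIN sys.allocation_units AS au
     ON (au.allocation_unit_id = bd.allocation_unit_id)
     INNER JOIN sys.partitions AS p
     ON (p.hobt_id = au.container_id)
WHERE bd.database_id = DB_ID()
      AND p.object_id = OBJECT_ID(N'dbo.Customers', N'U');
GO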

Unused memory on a data page

Memory is expensive and, for Microsoft SQL Server, an important component for fast queries. For this reason, you naturally want to avoid the situation where data pages are not completely filled and RAM is therefore wasted.

To demonstrate this huge discrepancy between a heap and a clustered index, create, in the first scenario, a heap table with a column C2 carrying a fixed 2,000-byte payload. Afterwards, a stored procedure inserts 10,000 rows into the heap table.

USE demo_db;
GO

DROP TABLE IF EXISTS dbo.Customers;
GO

-- Create a demo table
CREATE TABLE dbo.Customers
(
    C1 INT NOT NULL IDENTITY (1, 1),
    C2 CHAR(2000) NOT NULL DEFAULT ('Testdata')
);
GO

-- Create stored procedure for the INSERT process
CREATE OR ALTER PROC dbo.InsertCustomerData
    @NumOfRecords INT
AS
BEGIN
    WHILE @NumOfRecords > 0
    BEGIN
        -- C1 is an IDENTITY and C2 has a default, so all columns get default values
        INSERT INTO dbo.Customers
        DEFAULT VALUES;

        SET @NumOfRecords -= 1;
    END
END
GO

-- Execute the stored procedure for 10,000 rows
EXEC dbo.InsertCustomerData @NumOfRecords = 10000;
GO

The example above creates the table [dbo].[Customers] and a simple stored procedure that takes the number of rows to insert as a parameter. After the insert process, you can get insight into the data distribution with the next query, which retrieves the physical information about the stored data.

SELECT page_count,
       record_count,
       record_count / page_count AS avg_rows_per_page,
       avg_page_space_used_in_percent
FROM sys.dm_db_index_physical_stats
(
    DB_ID(),
    OBJECT_ID(N'dbo.Customers', N'U'),
    NULL,
    NULL,
    N'DETAILED'
)
WHERE index_level = 0;
GO

[Results of sys.dm_db_index_physical_stats for the heap table]

With the table’s current design, two or three records on average can be stored on one data page. This means that a data page is only filled to approx. 50 – 75%. If you change the heap table to a clustered index table, the results look completely different!

DROP TABLE IF EXISTS dbo.Customers;
GO

-- Create a demo table
CREATE TABLE dbo.Customers
(
    C1 INT NOT NULL IDENTITY (1, 1),
    C2 CHAR(2000) NOT NULL DEFAULT ('Testdata'),

    CONSTRAINT pk_Customers_C1 PRIMARY KEY CLUSTERED (C1)
);
GO

-- Execute the stored procedure for 10,000 rows
EXEC dbo.InsertCustomerData @NumOfRecords = 10000;
GO

[Results of sys.dm_db_index_physical_stats for the clustered index table]

The reason for this behaviour is that Microsoft SQL Server refers ONLY to the PFS page when storing a record in a heap, while a clustered index always has to follow the restriction of the clustered key and stores the record at the position of the key in the table.

A clustered index outperforms the heap – based on storage consumption – due to the need to store a record based on the key attribute. But keep in mind that – unlike a heap structure – the INSERT process must follow the B-tree structure when it saves a record to a data page.

Note

Before you go for a Heap structure, perform some tests to understand your data distribution in the data pages!

Workload when inserting records

The following demonstration shows the dependencies between the row size and the remaining free space on a data page.

IF OBJECT_ID(N'dbo.demo_table', N'U') IS NOT NULL
    DROP TABLE dbo.demo_table;
GO

-- The size of the column C1 will change with every test!
CREATE TABLE dbo.demo_table (C1 CHAR(100) NOT NULL);
GO

-- Clear the log file for the analysis of PFS updates
CHECKPOINT;
GO

-- This script will run for each test loop and insert
-- 10,000 records into the table
BEGIN TRANSACTION InsertRecord;
GO

DECLARE @I INT = 1;
WHILE @I <= 10000
BEGIN
    INSERT INTO dbo.demo_table (C1) VALUES ('This is a test');
    SET @I += 1;
END

-- Afterwards we count the log entries for the PFS updates
SELECT Context,
       COUNT_BIG(*)
FROM sys.fn_dblog(NULL, NULL)
WHERE [Transaction ID] IN
(
    SELECT [Transaction ID]
    FROM sys.fn_dblog(NULL, NULL)
    WHERE [Transaction Name] = N'InsertRecord'
          OR Context = N'LCX_PFS'
)
GROUP BY Context;

-- ... and have a look at the avg space used in the heap
SELECT page_count,
       avg_page_space_used_in_percent
FROM sys.dm_db_index_physical_stats
(
    DB_ID(),
    OBJECT_ID(N'dbo.demo_table', N'U'),
    0,
    NULL,
    N'DETAILED'
);
GO

ROLLBACK TRANSACTION;
GO

The above demonstration was run with different row sizes and gave the following results:

While the duration of the transaction changes moderately (157 ms – 1,459 ms), the number of updates of the PFS page increases dramatically beginning with a record length of 200 bytes (563 – 16,260). Although the PFS page refresh occurs quite frequently, the number of data pages grows moderately (149 – 5,000). The average fill level of a data page is between roughly 74% and 99%, depending on the size of the row.

The PFS page’s frequent updating is explained by the growing size of a data record since fewer data records fit on one data page and the various thresholds can be reached more quickly.

Record Length (bytes) | Time (ms) | PFS Updates | Pages | Avg. Used Space
100                   |       157 |         563 |   149 | 90.36%
200                   |       414 |       1,397 |   271 | 95.26%
500                   |       441 |       3,199 |   777 | 80.91%
1000                  |       595 |       5,916 | 1,436 | 86.79%
2000                  |       920 |      10,625 | 3,339 | 74.31%
3000                  |     1,138 |      16,069 | 5,004 | 74.27%
4000                  |     1,459 |      16,260 | 5,000 | 99.04%

Let’s do a little maths about what happens when data is stored on a data page.

Bytes | 50% | 80% | 95% | 100%
100   |  40 |  64 |  76 |   80
200   |  20 |  32 |  38 |   40
500   |   8 |  12 |  15 |   16
1000  |   4 |   6 |   7 |    8
2000  |   2 |   3 |   3 |    4
3000  |   1 |   2 |   2 |    2
4000  |   1 |     |     |    2

The above table shows the maximum number of records which “should” fit on ONE data page once each threshold has been exceeded. Please note that at a fill level of 95%, only 403 bytes (8,060 * (1 - 95%)) are mathematically available on the data page.

If the row size is 100 bytes, Microsoft SQL Server can store 40 records on ONE data page before the PFS entry gets updated to the 80% level. It takes 24 more records before the next update, to the 95% level, will happen.
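The threshold maths can be reproduced with a few lines of T-SQL (a sketch of mine, assuming the usual 8,060 bytes of usable space per page):

DECLARE @row_size INT = 100;
SELECT t.threshold_pct,
       FLOOR((8060.0 * t.threshold_pct / 100) / @row_size) AS max_rows
FROM (VALUES (50), (80), (95), (100)) AS t (threshold_pct);
GO

For a 100-byte row this returns 40, 64, 76, and 80, matching the first row of the table above.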

The bigger the row size, the faster the thresholds are reached. Keep in mind that the row size has a direct impact on the potential contention on the PFS page.

Let’s take a row size of 1,000 bytes for a record. With the 5th record, the PFS entry gets updated to the 80% level. When the 6th row (1,000 bytes) must be stored on the data page, it will still fit perfectly. In the original, colour-coded version of the table above, the green values indicate records that can still be stored on the data page, while the red values mark records that will require a new data page!

Summary

The aim when inserting new data into a heap is to avoid frequent updates of the PFS pages and to use the available space as fully as possible. The next article will show how you can boost performance when you insert data into a heap.

SQL – Simple Talk

Read More

Azure Availability Blog Series: Part 1 of 3

December 18, 2020   Microsoft Dynamics CRM

The move from on-premises to Azure brings its fair share of paradigm shifts to traditional architecture design, even if adoption of Azure services is not in the plan. Virtual Machine (VM) availability, or uptime, is one area that can cause a lot of issues if not planned for ahead of time, leading to rework and possible downtime. In this three-part blog series, we will look at why availability…

Source

PowerObjects- Bringing Focus to Dynamics CRM

Read More

FSI Blog Series, Part II: Sales and Marketing Collaboration

December 9, 2020   Microsoft Dynamics CRM

Welcome to Part II of our blog series exploring the partnership that exists between HCL-PowerObjects, Microsoft, and Seismic – the world’s most powerful storytelling platform. Our three businesses together provide a unique set of partnership benefits for our customers that includes execution deployment, data analytics, and customized content. This series focuses on the power of this triumvirate…

Source

PowerObjects- Bringing Focus to Dynamics CRM

Read More

FSI Blog Series, Part I: Digital Selling

December 6, 2020   Microsoft Dynamics CRM

Our faithful blog readers know that HCL-PowerObjects is a long-time, trusted, and award-winning Microsoft partner. After all, we are committed exclusively to selling, implementing, and supporting Dynamics 365 and Microsoft Business Applications. But you may not be as familiar with the synergistic partnership that exists between HCL-PowerObjects, Microsoft, and Seismic – the world’s most powerful…

Source

PowerObjects- Bringing Focus to Dynamics CRM

Read More

Part::partd Part specification is longer than depth of object

October 19, 2020   BI News and Info

Recent Questions – Mathematica Stack Exchange

Read More

Server-Side Synchronization Series Part 2: How Server-Side Sync reaches out to Exchange

September 9, 2020   Microsoft Dynamics CRM

This second part of the Server-Side Sync blog series goes over how we reach out to Exchange and which properties determine how far back we search for Exchange items. This part covers Incoming Email only; we will cover ACT in more detail later in the series.

How does Dynamics schedule the interval timings to reach out to Exchange? Do all users reach out at the same time?

Let’s focus on a single user that has been properly configured and test/enabled:

[Screenshot: a mailbox that has been configured and test-enabled for Server-Side Synchronization]

When test/enable is run on the mailbox for all three sync methods (Incoming, Outgoing, and ACT), Dynamics 365 reaches out to Exchange through EWS requests sent from servers running the Asynchronous Processing Service. For online, this means that certain server clusters take turns sending requests to the Exchange environment specified in the Email Server Profile (in the above setup, this is Exchange Online). We send several different SOAP requests to Exchange, such as FindItem, GetItem, SendItem, and SyncFolderHierarchy, and we send a test email to the user’s own mailbox as part of the test. A full list of EWS operations can be found here:

https://docs.microsoft.com/en-us/exchange/client-developer/web-service-reference/getitem-operation

After test/enable completes these operations, the “Configuration Test Results” screen above will change to “Success” and a bunch of values are modified on the associated MailboxBase table for the user. Please note the button labeled “Download Mailbox Details” in the header of the mailbox below:

[Screenshot: the mailbox record with the “Download Mailbox Details” button in the header]

This button makes a call to the MailboxBase and returns a ton of useful column data that will tell you the next time the user Mailbox record will reach out to Exchange:

[Screenshot: the column data returned by “Download Mailbox Details”]

A full list of each value and how to read them to determine when Server-Side Sync will fire next can be found on my blog post here:

How to read Download Mailbox Details

It’s important to note that the “postponemailboxprocessinguntil” value will have 5 minutes added to the exact time the mailbox completes processing, so there will always be at least 5 minutes between each Exchange check (the time sync takes to complete + 5 minutes).

So we’re reaching out to Exchange on a regular schedule now, but what does this actually do?

The value we want to focus on for this part is the Dynamics MailboxBase field “ProcessEmailReceivedAfter”. This value is not on the default mailbox form, but it can be added or queried through the mailbox entity API. This datetime value is where we store the received time of the last email we successfully promoted from Exchange.

[Screenshot: the ProcessEmailReceivedAfter value on the mailbox]

It’s important to understand this field to understand what we ask Exchange for when we reach out every 5 minutes. Here is the back and forth conversation we have with Exchange each time we reach out for incoming email:

[Diagram: the back-and-forth conversation between Dynamics and Exchange when checking for incoming email]

Since we found a few emails to promote to Dynamics automatically, we will update the “ProcessEmailReceivedAfter” time to the exact received time of the most recent email we promoted. The next time we reach out to Exchange, we will only query for emails since that time, since we know nothing new has entered the Inbox before that value.

But wait, if we only check for emails back to a certain time, what if we move items from a subfolder to the inbox? Wouldn’t they be missed since their received times are before what we have set?

That is correct. If you move emails from a subfolder and we have already promoted an item that arrived after the received time of the email you’re moving, we will never evaluate the moved item on subsequent cycles. This is where this field comes in: you can manually modify this time for the specific user, causing the next sync cycle to go back farther in time when evaluating emails. This would allow the sync session to find the email moved from the subfolder in this scenario.

Won’t we also reprocess all the other emails still in the Inbox that we already evaluated? Won’t this create duplicates?

We would also evaluate all of the other emails with received times after this datetime value, but duplicate emails cannot occur thanks to the “MessageID” field on the emails in Exchange. Each email has a unique Message ID, which we pass as part of the email creation within Dynamics.

On the Dynamics SQL side, “MessageID” has a unique key constraint, meaning that the column can only contain unique values (aside from one specific exception, which we will discuss in a later blog series). This prevents duplicate emails from being created in Dynamics even if they are reprocessed.
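The Dynamics schema itself is not exposed, but the mechanism is easy to illustrate with a hypothetical table (the names and types below are mine, not the actual Dynamics schema):

CREATE TABLE dbo.EmailDemo
(
    ActivityId UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,
    MessageId  NVARCHAR(320)    NOT NULL,
    CONSTRAINT UQ_EmailDemo_MessageId UNIQUE (MessageId)
);
GO

INSERT INTO dbo.EmailDemo (ActivityId, MessageId)
VALUES (NEWID(), N'<abc123@contoso.com>');   -- succeeds

INSERT INTO dbo.EmailDemo (ActivityId, MessageId)
VALUES (NEWID(), N'<abc123@contoso.com>');   -- fails with a duplicate key error
GO

The second insert is rejected by the unique constraint, which is exactly why a reprocessed email cannot become a duplicate record.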

The next part of our blog series will go over correlation methods, which is how the emails in the above scenario were automatically promoted based on how the user is configured.

Part 1, Part 3, Part 4

Dynamics 365 Customer Engagement in the Field

Read More