Tag Archives: search

Search SQL Server error log files

January 23, 2021   BI News and Info

Each instance of SQL Server logs information about its processing to a file known as the error log. Depending on how long an instance has been up and what is being logged, the log files might be small or large. When the log files are small, they are fairly easy to browse using SQL Server Management Studio (SSMS). But when they are large, it is cumbersome to browse through them to find individual error log messages using SSMS. There are even times when the error log file is so large it can't be opened using SSMS at all. This article will show you a few different ways to browse and search SQL Server error log files.

Using SSMS to search and filter large SQL Server error log files

When browsing a large error log file with SSMS, it can take a long time just to scroll through the file to find the portion of the log you are interested in reviewing. I find it easier to use the search and filter options to find information in large error log files, and I'll demonstrate how to use both of these options below.

Using the search option

The search option is useful for finding the next occurrence of a string of characters in the log. To follow along, browse through one of the archived log files, as shown in Figure 1.

Figure 1: Browsing my error log file

Figure 1 shows the beginning of the error log file, and the entries are sorted by the date/time from the oldest to the newest. You can see the Search function outlined by a red box at the top of the screenshot. To use the search function, just click on this search icon, which brings up the search dialog shown in Figure 2.

Figure 2: Search selection dialog

To search, just enter the string of characters you want to find in the Search for: field. The search can be case-insensitive or case-sensitive based on whether the Match case check box is checked. You can also search just the Message column or all the columns, depending on whether the Search Message column only box is checked. When an error log file spans many days, you could uncheck this checkbox to search for a particular date/time string, repositioning the log display to a specific day within a log file that contains multiple days.

For this demonstration, enter the string error in the Search for: criteria. Once the search criteria are filled in, the Search button is enabled, as shown in Figure 3.

Figure 3: Enabling Search Button

When you click the Search button, the error log display is repositioned to the first occurrence of the string error, as shown in Figure 4.

Figure 4: Repositioned to first occurrence of the string

Click the Search button again to move to the next message text that contains the string error, as shown in Figure 5.

Figure 5: Next occurrence of the string “error”

By reviewing Figure 5, you can see the search function found the string error just a few lines further down in the log (the actual string is located out of view to the right). By clicking the Search button repeatedly, you can progressively work through the large error log file, finding all the messages that contain the string error. Once the last message is found, the search will start over from the top if you click the button again.

Using the Search button repeatedly could be a little tedious, especially if the log file contains many messages with the string error. Another way to find all the messages without clicking and scrolling is to use the filter option.

Using the Filter Option

The filter option makes it a little easier to find all the occurrences of a string in the error log file. It does this by sifting through a large error log file and displaying only those rows that meet the filter criteria. Filtering is handy when you want to view specific log entries in a very large log file. To bring up the filter criteria, click on the Filter option in the Log File Viewer window, as shown in Figure 6.

Figure 6: Selecting the Filter Option

When the filter option is clicked, the dialog box in Figure 7 is shown.

Figure 7: Filter Options

As you can see from Figure 7, there are several different filter selection options from which to choose. You can use one or more of these filter options to identify the error log records you want to display. Table 1 lists the descriptions for each of these filter options.

Table 1: Descriptions for each filter option

User: The user name that is associated with the log entry
Computer: The computer that is associated with the log entry
Start Date: Log entry must be created on or after this date
End Date: Log entry must be created on or before this date
Message contains text: Log entry message must contain this text (case-insensitive)
Source: The source of the log entry
Instance Name: The instance name that is associated with the log entry
Event: The event ID that is associated with the Windows log entry

To demonstrate how to use the filter dialog to find specific error log entries, first try to find the ERRORLOG file directory name using the Message contains text filter item. The error log directory name is displayed on an error log line item that contains the string Logging SQL Server messages in the message text. Therefore, all you need to do is enter this string in the Message contains text filter item, check the Apply filter checkbox, and then click on the OK button, as shown in Figure 8.

Figure 8: Applying Filter

After clicking the OK button, only the error log lines that contain the text are displayed, as shown in Figure 9. If the Apply filter checkbox is not checked before clicking the OK button, the filter won't be applied.

Figure 9: Results of message text filter

Using the filter option is especially useful for finding those messages that are hidden amongst all the messages you are not interested in. I also find the Start Date and End Date filters extremely useful for finding log entries in a specific date range. The date range filter is handy when the error log file is very large and contains multiple days of error log records.

Out of memory errors when viewing large logs

If SQL Server has been up for a while and the error log has not been cycled, or a lot of messages have been written to the log file over a short time, then the error log might be very large, possibly gigabytes in size. If you try to open one of these gigabyte log files using SSMS, a memory exception will occur. Figure 10 shows the out of memory exception that can occur when opening one of these large error log files.

Figure 10: Out of memory exception when trying to view a large error log file

I got this error when I tried to open one of my large, archived log files that was over 8 GB in size. When this error occurred, some of my log records were loaded into the viewer. I could still use the search option, but I got another memory exception when I tried to use the filter option.
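
As an aside, since the root cause of this exception is sheer log size, one way to keep individual log files manageable is to cycle the error log on a regular schedule. This is a side note rather than part of the searching workflow, but SQL Server provides the documented system stored procedure sp_cycle_errorlog for exactly this purpose:

-- Close the current error log and start a new one;
-- the old log is kept as an archived file (ERRORLOG.1, etc.)
EXEC sp_cycle_errorlog;

Cycling doesn't delete anything; the current log simply becomes the most recent archived log, so a scheduled job that cycles the log keeps any single file from growing to the multi-gigabyte sizes discussed above.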

If you are trying to use SSMS to view large log files and are having memory issues, this doesn't mean you are out of luck. There are other options to view, search, and filter these large log files.

Using a text editor to view a large log file

One option for viewing a large log file is to use a text editor. But it can't just be any text editor; it needs to be a text editor that can read a large file. I have downloaded and used UltraEdit in the past to open large error log files. I'm not endorsing UltraEdit; I only mention it here because it is one of the editors I have used in the past to look at large log files. Keep in mind that UltraEdit is not free software; you need to have a license to use this product long-term. Before you consider downloading any text editor off the internet, make sure you understand the usage and license requirements of the software being downloaded.

Programmatically searching the error log file

Another option for searching those larger log files is to do it programmatically. SQL Server provides an undocumented extended stored procedure named xp_readerrorlog that can be used to search the error log and the SQL Agent log files.

Listing 1 is an example of how I used this undocumented stored procedure to search the active error log file on one of my instances of SQL Server.

Listing 1: Using xp_readerrorlog to find the location of error log file

exec xp_readerrorlog 0,1,N'Logging SQL Server messages in file';

This example searches for the string Logging SQL Server messages in file in the active log file. The output shown in Figure 11 is returned when running the command.

Figure 11: Output from running code in Listing 1

Searching for this particular string in the active log file finds the log record that identifies the file location where the error log messages are being written.
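
As a quick alternative sketch (assuming a version of SQL Server whose SERVERPROPERTY function supports the ErrorLogFileName argument), you can also retrieve the error log path directly instead of searching for the message above:

-- Returns the full path of the current SQL Server error log file
SELECT SERVERPROPERTY('ErrorLogFileName') AS ErrorLogPath;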

Even though this stored procedure is undocumented, there are many resources out there that explain how to use it. This stored procedure supports seven parameters. Those parameters are described in Table 2.

Table 2: Parameters for xp_readerrorlog

1: Identifies the error log file that you would like to read. Set this parameter to 0 to read the current error log, or to 1, 2, 3, and so on to read one of the historical error log files.
2: Identifies which error log to search: 1 or NULL for the ERRORLOG, or 2 for the SQL Agent log.
3: The first string you want to search for in the error log file.
4: The second string you want to search for in the error log file.
5: The start time constraint on searching.
6: The end time constraint on searching.
7: Sort order of the output (ascending or descending).
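
To see how several of these parameters work together, here is a hedged example (the search strings and dates are illustrative values, not ones from the article): it searches the current error log for messages containing both the strings error and severity, limits the results to a January 2021 date range, and sorts them newest first:

-- Parameters: log number (0 = current), log type (1 = ERRORLOG),
-- first search string, second search string, start time, end time, sort order
EXEC xp_readerrorlog 0, 1, N'error', N'severity',
     '2021-01-01', '2021-01-23', N'desc';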

Finding all the records in a large log file that contain the word error can easily be done by just changing the search string in parameter 3 of the code in Listing 1. You can also write a short T-SQL script to find all the log records from the active SQL Server log file for yesterday and place them in a temporary table for further analysis, using the code in Listing 2.

-- Declare variables needed
DECLARE @StartDate date,
        @EndDate   date;
-- Create temporary table to hold error log records
CREATE TABLE #ErrorLogForYesterday (
  LogDate datetime,
  ProcessInfo varchar(max),
  Text varchar(max));
SET @StartDate = dateadd(dd,-1,getdate()); -- Yesterday's date
SET @EndDate = getdate(); -- Today's date
-- Extract error log records for yesterday into temporary table
INSERT INTO #ErrorLogForYesterday EXEC xp_readerrorlog
            0,1,N'',N'',@StartDate,@EndDate;
-- Display error log records extracted
SELECT * FROM #ErrorLogForYesterday;
-- Cleanup
DROP TABLE #ErrorLogForYesterday;

Listing 2: Code to extract yesterday’s error log records

Programmatically finding error log records makes it easy to build processes to analyze the error log file. Using the method in Listing 2, a DBA could create a series of scripts that could programmatically run the xp_readerrorlog stored procedure to quickly analyze the different error log files.
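
As a minimal sketch of such a script (the archived-log count here is an assumption; by default SQL Server retains six archived error logs alongside the current one, and the loop assumes they all exist), the following gathers every message containing the string error from all seven files into a single table for analysis:

-- Collect messages containing 'error' from the current log (0)
-- and the archived logs (1 through 6) into one temporary table
CREATE TABLE #AllErrorMessages (
  LogDate datetime,
  ProcessInfo varchar(max),
  Text varchar(max));
DECLARE @LogNumber int = 0;
WHILE @LogNumber <= 6   -- assumes the default of six archived logs
BEGIN
  INSERT INTO #AllErrorMessages
    EXEC xp_readerrorlog @LogNumber, 1, N'error';
  SET @LogNumber += 1;
END;
SELECT * FROM #AllErrorMessages ORDER BY LogDate DESC;
DROP TABLE #AllErrorMessages;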

Reading and Searching SQL Server Error Log Files

When SQL Server creates large error log files, it presents challenges for DBAs who need to read them. Large log files are cumbersome to scroll through to find errors. Luckily, the log viewer functionality of SSMS has the Filter and Search features built in to allow a DBA to find strings within these large log files quickly. Additionally, using T-SQL code to call the undocumented xp_readerrorlog stored procedure allows a DBA to build scripts to read those large log files. Using these different methods to find errors in large SQL Server log files is critical for managing and maintaining SQL Server.

If you like this article, you might also like SQL Server Error Log Configuration – Simple Talk (red-gate.com)

SQL – Simple Talk

Complete Field Service Solution with Auto Scheduling, Optimized Routes, Check In-Out, Territory Management, Radius search, et al, within Dynamics 365 CRM / PowerApps!

November 25, 2020   CRM News and Info

At the heart of any company's success are its customers. The success and growth curve of any company can be predicted by gauging its customer satisfaction levels and market reputation. One vital component that has a significant effect on these attributes is the quality of customer field service. Are they quick? Are they thorough? Are they reliable? Are they worth the cost? These are just some of the questions that an organization has to answer for every customer it serves.

With Dynamics 365 CRM, Microsoft has allowed organizations to leverage data to understand where their customers are and when the ideal time is to show up. With the emphasis on a customer-first approach, 'ask and you shall be given' has become the norm in customer engagement and support. Booking services like consultations, deliveries, repairs, etc. at a customer's fingertips has made convenience a necessity rather than a luxury.

This makes on-field customer servicing an essential business process for many organizations. Companies turn to technology to help their managers allocate, route, and manage their field service agents, as well as optimize and maximize the overall efficiency of their end-to-end field service process.

This is where Maplytics comes in! Maplytics handles all your field service requirements as an end-to-end field service solution. Our efforts in creating this solution have even been rewarded with the community's constant appreciation and Microsoft's Preferred Solution badge on Microsoft AppSource.

The best way to grasp the power of the Maplytics end-to-end field service solution is to see it in action! So let's explore it with a use case. Consider Josh, the manager at a fast-growing carpentry company, 'Builds That Last.' His company is getting recognition for its quick carpentry services and quality customer support.

Josh has a group of seven field service agents under his management, and today he is managing one of the highest sales regions, Staten Island, New York. On average, they get a minimum of 10-15 customers daily from this region, and his responsibility is to serve all of them. Therefore, he has two of his best field agents, William and Joe, prepped to be deployed.

Mindful of the volume of requests, Josh is committed to doing an exceptionally good job. He is handy with Maplytics, so he knows he will have no trouble allocating route-optimized schedules that will save his field agents' time and transportation costs. Josh uses the Auto Scheduling feature of Maplytics to do this. First, he plots all Active Cases on the map as shown below and then uses the Auto Scheduling option in Mass Actions to open the Auto Scheduling card.

Now Josh selects Joe and William as the users for whom the schedule is to be generated. To keep the integrity of previously created schedules intact, Josh considers selecting the Consider Existing Meetings checkbox. This option takes the existing meetings of his field agents into consideration while generating the schedule. He can also choose whether to add these existing meetings to their current schedule or reschedule them. But on checking, he finds his field agents don't have any existing meetings on the current day, so he decides to leave it unchecked.

He then uses the Advanced Preferences option to set a maximum of 10 meetings in a day, where each meeting lasts 30 minutes or less. If there were a spike in requests, Josh could easily create a multi-day schedule to resolve all requests over a span of multiple days. Once all the preferences are set, he clicks Proceed to create the Auto Schedule.

Josh now has an optimized schedule for his field reps that was created within minutes using Auto Scheduling! The color-coded nature of the routes makes it simple to clearly distinguish between Joe’s and William’s routes.

Josh then creates an activity for this schedule. He chooses to create an appointment and fills in all the required details as shown below. Once filled, he creates this activity for William and Joe.

Now let's transition to the other end of the solution. Once the activity is created by the manager, the field agents can go to the field and start using the Maplytics app on their mobiles or tablets. They can start navigation on their assigned routes using either Google Maps or the Waze app and view turn-by-turn navigation while following the route in the field.

Let's shift to the perspective of the on-field agent William and explore his experience with Maplytics. William has a schedule assigned to him for the day, and he's ready to get started. First, he opens the schedule assigned to him on the map to understand his day, as shown below. Once plotted, he clicks on the Navigate button to start his schedule.

As William clicks on the Navigate button, he gets redirected to Google Maps, where he can easily view the route for turn-by-turn directions. This happens because William has set the 'Navigation Within' option to Google App in the user configuration detail record.

Following directions is a piece of cake now with the easy-to-follow route on the map. With the optimized route and the turn-by-turn navigation, William reaches the destination in the least time possible. When he reaches the service location, he uses the Check-In button to register his arrival at the service site, as shown below.

William's appointment is for the inspection and restoration of an antique table. He diligently begins to inspect the table, but after 10 minutes he concludes that the table is beyond the scope of restoration and deems it a lost cause. He uses the Check-Out button to register that he is done with the appointment. Now, since his appointment is cut short, he's left with free time. William sees this as an opportunity and quickly decides to schedule another meeting with a nearby customer.

To do this, he opens Maplytics on his phone and clicks the Locate Me button on the Detail Map to plot his current GPS location. Next, he opens his Plot Records card and enters 2- and 3-mile radii as the Search Radius, since his next location is more than 3 miles away. He then selects Leads as the data source and My Open Leads as the View, and hits the Search button.

Now William has all the leads that lie within 2 to 3 miles of his current location plotted on the map, as shown below.

He uses the tooltip cards of the respective pushpins and looks at the Line-of-Sight Distance, as shown below, to understand which location is the nearest and most suitable from his current location. He sees that the closest customer is only 0.40 miles away, and it is also aligned with the route to his next customer! William proceeds to contact the customer to confirm an earlier service time. Once he receives confirmation, he makes his way to the customer's location by right-clicking on their pushpin and adding it to the route. He uses Check-In to register his appointment, performs the service, checks out, and moves on to continue with his remaining schedule.

While the field agents are continuously logging their progress, Josh can monitor how many requests have been met and also track which agents are fulfilling the most requests by plotting the Check-Ins and Check-Outs on the map. This gives him an overview of the day's performance by the entire team and by individuals.

The above use case offers just a glimpse of how Maplytics' end-to-end field service solution simplifies everyday life for the managers and field agents in an organization. While this solution simplifies organizing and managing field agents for managers, it optimizes convenience and maximizes productivity for the field agents.

Maplytics is a 360-degree geo-analytical solution that revolves around customer satisfaction, and therefore you can find a huge array of other productivity-enhancing features that allow you to auto-create territories, manage territories, perform along-the-route searches, set areas of service, and much more within the Maplytics feature set!

So if customers are your priority and you would like to introduce the next level of intelligence to your field service, get hands-on experience with Maplytics today through our free 15-day trial. You can download Maplytics from our website or Microsoft AppSource (don't forget to check out our shiny official badge of Preferred Solution there!)

Need someone to walk and talk you through this end-to-end solution? Our support team is waiting for you, just an email away. Write to us at crm@inogic.com to get started with your free and personalized Maplytics demo!

Until the next time – we'll be around on your map!

CRM Software Blog | Dynamics 365

Google details how it’s using AI and machine learning to improve search

October 16, 2020   Big Data

During a livestreamed event this afternoon, Google detailed the ways it’s applying AI and machine learning to improve the Google Search experience.

Soon, Google says, users will be able to see how busy places are in Google Maps without having to search for specific beaches, parks, grocery stores, gas stations, laundromats, pharmacies, or other businesses, an expansion of Google's existing busyness metrics. The company also says it's adding COVID-19 safety information to business profiles across Search and Maps, revealing whether they're using safety precautions like temperature checks, plexiglass, and more.

An algorithmic improvement to “Did you mean,” Google’s spell-checking feature for Search, will enable more accurate and precise spelling suggestions. Google says the new underlying language model contains 680 million parameters — the variables that determine each prediction — and runs in less than three milliseconds. “This single change makes a greater improvement to spelling than all of our improvements over the last five years,” Prabhakar Raghavan, head of Search at Google, said in a blog post.

Beyond this, Google says it can now index individual passages from webpages as opposed to whole pages. When this rolls out fully, it will improve roughly 7% of search queries across all languages, the company claims. A complementary AI component will help Search capture the nuances of what webpages are about, ostensibly leading to a wider range of results for search queries.

“We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad,” Raghavan continued. “As an example, if you search for ‘home exercise equipment,’ we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page.”

Google is also bringing Data Commons, its open knowledge repository that combines data from public datasets (e.g., COVID-19 stats from the U.S. Centers for Disease Control and Prevention) using mapped common entities, to search results on the web and mobile. In the near future, users will be able to search for topics like “employment in Chicago” on Search to see information in context.

On the ecommerce and shopping front, Google says it has built cloud streaming technology that enables users to see products in augmented reality (AR). With cars from Volvo, Porsche, and other “top” auto brands, for example, they can zoom in to view the steering wheel and other details in a driveway, to scale, on their smartphones. Separately, Google Lens on the Google app or Chrome on Android (and soon iOS) will let shoppers discover similar products by tapping on elements like vintage denim, ruffle sleeves, and more.

Above: Augmented reality previews in Google Search.

Image Credit: Google

In another addition to Search, Google says it will deploy a feature that highlights notable points in videos — for example, a screenshot comparing different products or a key step in a recipe. (Google expects 10% of searches will use this technology by the end of 2020.) And Live View in Maps, a tool that taps AR to provide turn-by-turn walking directions, will enable users to quickly see information about restaurants including how busy they tend to get and their star ratings.

Lastly, Google says it will let users search for songs by simply humming or whistling melodies, initially in English on iOS and in more than 20 languages on Android. You will be able to launch the feature by opening the latest version of the Google app or Search widget, tapping the mic icon, and saying "What's this song?" or selecting the "Search a song" button, followed by at least 10 to 15 seconds of humming or whistling.

“After you’re finished humming, our machine learning algorithm helps identify potential song matches,” Google wrote in a blog post. “We’ll show you the most likely options based on the tune. Then you can select the best match and explore information on the song and artist, view any accompanying music videos or listen to the song on your favorite music app, find the lyrics, read analysis and even check out other recordings of the song when available.”

Google says that melodies hummed into Search are transformed by machine learning algorithms into a number-based sequence representing the song's melody. The models are trained to identify songs based on a variety of sources, including humans singing, whistling, or humming, as well as studio recordings. They also strip away all the other details, like accompanying instruments and the voice's timbre and tone. This leaves a fingerprint that Google compares with thousands of songs from around the world to identify potential matches in real time, much like the Pixel's Now Playing feature.

“From new technologies to new opportunities, I’m really excited about the future of search and all of the ways that it can help us make sense of the world,” Raghavan said.

Last month, Google announced it will begin showing quick facts related to photos in Google Images, enabled by AI. Starting in the U.S. in English, users who search for images on mobile might see information from Google’s Knowledge Graph — Google’s database of billions of facts — including people, places, or things germane to specific pictures.

Google also recently revealed it’s using AI and machine learning techniques to more quickly detect breaking news around crises like natural disasters. In a related development, Google said it launched an update using language models to improve the matching between news stories and available fact checks.

In 2019, Google peeled back the curtains on its efforts to solve query ambiguities with a technique called Bidirectional Encoder Representations from Transformers, or BERT for short. BERT, which emerged from the tech giant’s research on Transformers, forces models to consider the context of a word by looking at the words that come before and after it. According to Google, BERT helped Google Search better understand 10% of queries in the U.S. in English — particularly longer, more conversational searches where prepositions like “for” and “to” matter a lot to the meaning.

BERT is now used in every English search, Google says, and it’s deployed across languages including Spanish, Portuguese, Hindi, Arabic, and German.

Big Data – VentureBeat

Intelligent Analytics: The Search For Hidden Treasure In Your Business Data

April 2, 2020   BI News and Info

Tech Unknown | Episode 5 | Season 2

Featuring guests Carla Gentry, Iver van de Zand, and Timo Elliott with host Tamara McCleary

Subscribe: Apple Podcasts | Stitcher | Google Play

Your organization is sitting on buried treasure. There are precious insights to be extracted from organizational data and used to guide your business to new levels of profitability. In this episode of Tech Unknown, our guests explore how you can find the buried treasure, extract and refine it, and put it to work for your business.

Businesses worldwide are creating and storing massive amounts of data, more than at any point in history. How much data? By 2025, we’re looking at 175 zettabytes of data. To put that in context: If you burned those files to DVDs, you’d have a stack that reached all the way to the moon.

And back.

Twelve times.

This business data is one of the most valuable assets for a modern corporation. It holds the information businesses need to make smart investments, develop new lines of business, increase efficiency, and ultimately drive more revenue. 

If simply having the data were enough to realize these results, every business would be going like gangbusters. But the value doesn’t come in having the data; it’s in using it, analyzing it, extracting insight, and using those insights to guide the business.

If you’re still relying on spreadsheets and manual processes to analyze your corporate data, you’re likely missing out on much of the potential. There’s simply too much data, in too many streams, updated too quickly, for humans to keep up.

That’s where intelligent cloud-based analytics can help. The processing power of the cloud can help with every part of the process: data collection, sanitization, aggregation, analysis, and reporting.

This episode, we’re taking you on a treasure hunt to find the gems hidden in your mountains of business data. Our experts explain what intelligent analytics means, share how businesses can use it, and offer inspiring success stories from leading organizations.

Listen to Learn:

  • How businesses can use machine learning and AI to analyze data
  • The three crucial questions you can answer with predictive analytics
  • How analytics improves outcomes across the organization
  • How custom dashboards can make reporting quick and easy

Want to learn more about SAP Analytics? Connect with an expert today.

About Our Guests:

Carla Gentry is a data scientist with over two decades of experience in predictive models, algorithms, and data structure as they relate to driving business insight. She has consulted for Fortune 100 companies and is currently the resident “Data Nerd” at the University of Central Florida.

“Look at your data. Don’t wait two, three, four months to look at that data because now it’s hindsight. Look at that data daily or weekly if you can.” –Carla Gentry

Iver van de Zand is the vice president of solution management and product strategy at SAP. He is the author of Passionate on Analytics and blogs at ivervandezand.com.

“Bringing analytics into the cloud allows you to scale your analytics towards the scale of your business.” –Iver van de Zand

Timo Elliott is the vice president, global innovation evangelist at SAP. He has spent over 30 years presenting to business and IT audiences in over 58 different countries, talking about digital transformation, AI, analytics, and the future of digital marketing. He also blogs at timoelliott.com.

“Analytics really is core to every aspect of business. Wherever you have a process, or a customer experience, or employee experience, the very first thing you need to do is be able to measure it. Without being able to measure it, you can’t analyze it and you can’t optimize it.” –Timo Elliott

Did you miss our last episode?

Check out our previous episode with guests Lisa Anderson, Tim Crawford, Eric Kavanagh, and Tom Roberts: “Intelligent ERP: The Foundation of Digital Evolution.” Click here to listen.

Episode 5 Transcript:

Tamara: Welcome to Tech Unknown, a podcast to prepare your organization for the tech-centered future of business. I’m Tamara McCleary, CEO of Thulium.

Our big umbrella topic this season is data. We’re digging into how sharing data across the organization can increase efficiency, reduce costs, and improve the customer experience.

This episode, we’re going to talk about the process of turning data into insight. Raw data by itself isn’t yet an asset to your business. The value comes from what’s hiding inside the data: The trends you can use to build predictive models and guide your business to greater profitability.

Think about it this way: Imagine that you’re an old-timey pirate, sailing the Seven Seas…

And you have a map that leads to an island full of buried treasure. That’s your data.

Parrot: Awk! Pieces of eight!

Tamara: Exactly. But when you get to the island, you can see that it’s MASSIVE… you can’t even see the other side from where you’re standing! 

And your map doesn’t say where the treasure’s buried… there’s just a big X marked across the whole island.

Now, you could just start digging and hope for the best… a total shot in the dark! But, that’s what most businesses are doing with their data.

So in theory, you have this enormous treasure at your fingertips…

But really, you can’t find your doubloons hidden in all that dirt.

When we talk about data as a business asset, this is the challenge. We all know there’s value in the data, but how do we find it and extract it?  

The answer is through cloud-based, intelligent analytics.

Now, analyzing data is nothing new – but analyzing massive amounts of data quickly for real-time insight is new. Revolutionary, even. Let’s dig into why we need the processing power of the cloud for data analytics at scale.

Iver van de Zand: Hey, my name is Iver van de Zand. I’m the vice president of product strategy for augmented BI with SAP.

If we look today how the worldwide volume of data is evolving, then that looks as follows. Today, worldwide, we have, more or less, 50 zettabytes of data, which is expected to grow to 175 zettabytes in 2025. If you stored that on DVDs, that brings you 23 times to the moon. 

Companies today are only able to look at the tip of the iceberg of that data, yeah? Because simple business intelligence today with on-premises users doesn’t allow to even scan and analyze that amount of data. Bringing this all to the cloud means that we can use new technologies, automated insights, augmented insights to start understanding these enormous amounts of data. That can only be done through the cloud.

A few more reasons going to the cloud is all about using analytics… a cloud allows you a way bigger and a larger variety of accessing data sources. But also, if you manage a platform around analytics through the cloud, you have way lower costs in terms of manageability, yeah? Because it’s centrally managed for you. That means together, that if you’re looking at the real potential of bringing analytics to the cloud, then that answers a scalability question. Bringing analytics into the cloud allows you to scale your analytics towards the scale of your business.

Tamara: So you have zettabytes of data to process, and now you have the storage and processing power that you need to do it. But you need one more piece of technology to eliminate the final bottleneck to business intelligence: You. Well, not you personally, but human beings. If your team is manually processing data, or your data is siloed between departments, you’re holding up the computers! Here’s Iver again on why machine learning and AI are a crucial part of the equation.

Iver: The amount of data is increasing that rapidly that we must come with machine learning-driven algorithms that help us find the correct insights at the correct moment. You could also see this as making things smart so what we massively are doing, incorporating in our software today are, for example, forecasting and correlation algorithms. Forecasting algorithms automatically start forecasting for you through time, whatever metric you have. Correlation algorithms are even applied even more. Correlation algorithms tell you what perspective, what attribute contributes to a certain metric the most.

To make it very tangible, imagine your margin is growing with 2%; you want to know what element contributed to that? What caused that increase of 2%? Was it the product line? Was it the time of the year? Was it a region? Was it the sales owner? So that’s correlation really helping you with machine learning and artificial intelligence finding the correct insight the moment you need that. That, for me, represents the role of artificial intelligence in modern analytics.

Tamara: So all we have to do is connect our data streams to a cloud-based analytics platform, and we can all head home for the day?!  Well, in that case, I welcome our robot overlords! Uh, sorry – that was entirely in jest for my fellow science fiction enthusiasts. The best results are clearly achieved when we pair human intelligence with artificial intelligence…Here’s Carla Gentry, data scientist at Analytical Solution, to tell us why.

Carla Gentry: It’s not about just collecting data, it’s about gleaning insights from that data. So, I would say the future of analytics has to be that we finally, you know, stop talking about Perl and Python and R and all this other programming crap, and start talking about the results that we actually get from that. Because all of those things that I just mentioned are just a tool that is in the data scientist’s arsenal of weapons. They have statistics, that is a tool, they have programming as a tool. SQL is just a tool. The visualizations and all are a tool, but they’re tools that are needed to be able to show your boss or your client or, you know, the C-suite that this is working.

So, let your data tell a story. You know, “this is the information, where we got it, what we did to it, how we cleansed it.” Be transparent. Don’t start, “Oh, well, there was missing data, so we just auto-populated some stuff in there.” Well, you just skewed the entire dataset and biased it based on your personal biases because you assumed what the customer meant, and you really don’t have a clue. So, letting the data speak for itself is how we’re gonna get integrity and transparency.

We have to have clean data to be able to do machine learning. 

Tamara: So we start with clean data. We take it to the cloud for processing and analytics. Okay, then how do we identify the buried treasure? Iver says there are three types of questions our data analytics can answer.

Iver: If I look at predictive analytics for my area, meaning providing customers with 360-degree insight in how their business is running, then you typically see three areas. You have an area that answers the questions, “hey, how is that process going? What is the level of that KPI? How am I doing there?” This is typically covered by business intelligence, whereas you also have an area is on, “hey, what am I planning to do? What was my objective to have that KPI and be on?” Which is typically the planning area. And to answer your question, answering questions as “what could happen? What happens if I change that?” That area of insight is typically covered by predictive analytics. So predictive analytics for me is all about looking into the future, but primarily also simulating, “what can happen with this product line if I change the supplier?” for example.

Tamara: As Iver says, there are three types of questions that intelligent analytics can answer: Performance, planning, and prediction. These “three p’s” combined can generate amazing insights for any type of organization. For example….

Iver: This is a company that produces tires for cars in Europe, and of course, they deliver those tires today with modern sensor technology embedded in those tires. And those sensors measure the quality of the tire 24 hours a day, 365 days per year. The data that they take out of that is, of course, very useful for their suppliers and the support organizations to understand because they can predict when a tire on a truck is going to break, so they can automatically service the truck. What I really like about this example is that you would expect that company to provide that data on the tires to those suppliers and assistance to these services companies. But they didn’t. They sell the data, and this is a very nice example of not only solving a business problem, even stronger: monetizing a business problem.

Tamara: When you fully leverage the value of your data as a business asset, you can actually sell that asset as a product! As in Iver’s example, even a tire company can be in the information business. 

But predictive analytics can do more than solve business problems. It can even do more than create new business models! Smart data analytics can literally save lives.

Iver: So I think everybody’s with me if I say that data is more or less becoming the blood of our economy today, yeah? The blood that drives the heartbeat of our economy today. And a nice example is in a hospital in London where they research patients with heart diseases, using sensor technologies that are put into the bodies of these patients. And using modern artificial intelligence, today, this hospital is capable of predicting a heart failure with a patient, and that is massive news. So by constantly monitoring heart rhythm and blood quality through sensors put into the bodies of those patients, the hospital can predict – when this patient makes a certain move or has a too intense pattern in his life – that they can warn the patient that a heart failure is going to come. And I think there is no better example to prove the use and the value of artificial intelligence than this one.

Tamara: That’s quite a heartwarming story, don’t you think?

AI voice: Bad pun detected. Predictive analysis shows a 60% possibility of future attempts at wordplay. Recommendation: Early termination of episode.

Tamara: Okay, so AI may be great at predicting the future of business, but it still doesn’t have much of a sense of humor.

AI voice: That makes two of us.

Tamara: Ouch! Okay, fine. Maybe intelligent analytics can quantify human behavior. In fact, Carla has a great example of how you can use predictive analytics to take some of the guesswork out of hiring new talent.

Carla: Well, like for turnover within a particular company, you can look at the habits and the personalities of the particular person in that position. So, we’ll take, for example, like a high turnover position like sales or a cashier. So, say you were a big-box industry, and you have 120% turnover, and you’re spending millions and millions of dollars to train all of these people, and what you really need to do is expand your employee lifecycle. So, you wanna look and see, you know, the employees that you have there, what are their personalities, and you can do that through an assessment. You can do that through surveys or interviews or however you wanna do it. So now, you’ve collected this information. It’s like a test and training set. So, you’ve got the employees that are happy, what are their characteristics? I’m talking about personality characteristics. And look at the employees that are unhappy, see what their characteristics are.

And then when you go to hire people, you give them that same survey, that same assessment, that same, you know, questionnaire and ask them and see what they answer. So, the person that you hire that has the personality more likely to, you know, stay longer and increase that employee, you know, lifecycle is gonna have a personality like this. So, then at the very end, you’ve got like red light, green light, yellow light. The green lights are the people that have similar personalities to the successful employees. So, of course, they get an immediate interview. The people in yellow are the ones that, you know, they have some characteristics, but you’re probably gonna have to have a more formal interview to be able to get any insight. And then the third case in red, you know, and this is all a dashboard kind of thing for the employed or the C-suite or the HR person to be able to look at and make decisions. And it’s a visual kind of cue, we’re, “Okay, red light, we don’t hire them. We’re not even gonna give them a callback.”

So, that is a practical use of being able to predict your employee lifecycle and your turnover rate, based on collecting unstructured data from, you know, surveys or assessments or questionnaires within your own company and then using that same thing for your external potential employees to try to predict whether they would actually work out well in that position or not.

Tamara: Intelligent analytics can keep a heart beating, a tire from blowing out, and a new employee from burning out. But let’s not forget perhaps the single most important source of data we should be analyzing and optimizing for – your customers. Carla says that data is “the voice of the customer,” and she’s not wrong!

Carla: Well, anytime your customer is talking about your brand, that’s the voice. And whether that be social media, whether that be a Google review or a Yelp review or a survey or Net Promoter score, you know, like, how are we doing? Would you recommend us to a friend? Emails that you’ve sent out, did they respond? If they did, how long it took. Any of that information is the voice of the customer because a lot of times when we’re looking at data, we’re thinking PoS data or syndicated data, you know, data that we would get from, you know, like, we would purchase data, but there’s so many other forms of data that’s for free. I mean, we could go on Twitter and use hashtags and see what the customers say about us, and that didn’t cost us anything. We didn’t even have to pay a Twitter ad for that. We just had a person, maybe an influencer, ask a question about a brand. Now we can collect all of that information about how the customers feel about us. So, that’s the thing when I’m talking to people about collecting data. There’s many, many resources out there. Just because you don’t feel that you have a robust dataset, there’s more information out there, you just have to look for it. And a lot of times, it’s free.

Look at your data. Don’t wait two, three, four months to look at that data because now it’s hindsight. Look at that data daily or weekly, if you potentially can, and be able to rectify those issues. That’s why we have, like, knock meetings, these emergency, you know, meetings, “Oh, God, our website is down,” because they’re losing money. When something’s wrong with that website, they respond to it immediately. So, if something is wrong with their service, they need to respond just as immediately. Because, yeah, the website’s your money. That’s your PoS data, your bread and butter, but your customer, ultimately, is that bread and butter. That’s the person you need to respect and take care of. So, if they’ve taken the time to respond to you, respond back to them. Find out what the problem is. That’s free data.

Tamara: As you can see, intelligent analytics is all about finding the value in your data, no matter what business you’re in, and in every department in that business. If we go back to the treasure map analogy from earlier… 

Avast, it’s good to be back on the open sea, me hearties. But if you’ll recall, we had a map that led us to Treasure Island, but it just had a single big X over the entire landmass! The value was there, but we couldn’t get to it without a lot of time-consuming labor.

Intelligent analytics can put the X exactly where the treasure is. More than that, it can call in a swarm of drones with shovels…

Equipped with SONAR….

To pull out those priceless insights and bring them right aboard my boat!

Robot voice: Here be your treasure, Captain McCleary. Avast me hearties, yo ho.

Tamara: Here’s one big difference between a treasure map and an analytics dashboard, though: a custom dashboard can not only show you these insights, it can also update the most important metrics for you in real time. With a custom analytics dashboard, you have the data right at your fingertips.

Here’s Iver again to explain how custom dashboards work for the SAP Analytics Cloud.

Iver: As part of SAP Analytics Cloud, Analytics Designer allows you to build custom-designed dashboards. And that is really what our customers use Analytics Designer as part of SAP Analytics Cloud for. These examples of custom-designed dashboards are, for example, we have a number of car manufacturing companies in Germany that create these custom-designed dashboards that are put into the factory on the ceiling where everybody in the factory can in real-time follow the number of cars they have produced in a day, the number of failures, the outage of the system. So, this is really used to provide all the employees with constant insights on how strong the performance in their factory is on that day. But it also triggers as a kind of a, I would say, gamification mechanism, teaching people to do better the next time. So, this is one of the ways that Analytics Designer is used with our customers.

Tamara: We’ve covered a lot of ground in this episode – from hospitals to highways to the Spanish Main – but it’s time to wrap it all up. In season one of Tech Unknown, I interviewed Timo Elliott, innovation evangelist for SAP. Timo told me a great story about analytics and football fans. It really illustrated how cloud-based analytics can help solve for any problem, including optimizing how much fun you’ll have at a San Francisco 49ers game!

I can almost hear that conversation now… 

Timo Elliott: One of my favorite projects recently is we worked with the San Francisco 49ers. So, in their stadium, they really want to optimize people’s enjoyment of the game. And so, they’re collecting data from nine different systems in real time, so they can immediately spot anything from parking, to the food and beverage, to the weather problems, and really, in real time, act to make sure that whatever is happening is not damaging the overall customer experience. In fact, they even have big screens in what they call the Executive Huddle, so that, as you’re watching the game, they have these big screens with SAP Analytics Cloud bringing all of the data from the game in real time but also all of the historic information, so you can see how this game relates to the overall trend. So it’s directly helping fans enjoy the game more because they have more of the context.

So again, you know, analytics really is core to every aspect of business. Wherever you have a process, or a customer experience, or employee experience, the very first thing you need to do is be able to measure it. Without being able to measure it, you can’t analyze it and you can’t optimize it. So ultimately, analytics, I think, is the ultimate business process that we need to work on in today’s organizations.

Tamara: Analytics, Timo says, is the ultimate business process that we should be concerned with. It makes sense: Data is one of your business’ most valuable assets. The insights from your data can help measure your performance, help you plan for the future, and predict what’s next for your industry. 

Even more than that, the data itself can become part of your product offerings. Intelligent analysis can actually change the way you do business, whether it’s adding a new revenue stream or completely transforming your business model.

But you can’t get all that value from your data with a treasure map and a shovel. It takes intelligent analytics – cloud-based, real-time, augmented with artificial intelligence and machine learning – to unearth every bit of that buried treasure.

Thanks for listening to Tech Unknown, and thanks to my guests Carla Gentry, Iver van de Zandt, and Timo Elliott. Please subscribe on iTunes, Google Play, Stitcher or wherever you listen to podcasts.

I’m Tamara McCleary and until next time: Stay sharp, stay curious, and keep exploring the unknown.

To learn more about intelligent ERP, go to s4hanaopp.com. You can also find a transcript of this episode and more at digitalistmag.com. And make sure to subscribe wherever you listen to podcasts.

Digitalist Magazine

Voice Search Is the Future of SEO

March 26, 2020   CRM News and Info

Achieving top search position for dozens of keywords is the ultimate goal for many content marketers, but Google keeps throwing curveballs our way to make that quest even more difficult. One of these recent updates that is shaking up how we optimize our websites for SEO is voice search. 

So what is voice search? Voice search makes it just a bit easier for us to find the resources and answers we need to make purchasing decisions by allowing us to input our searches audibly instead of typing them up. And because of this added layer of convenience, this won’t be going away any time soon. Organizations will have to consider this factor when optimizing their websites and webpages for SEO. 

Voice search is still new, so many of us marketers are still trying to wrap our heads around how it works and its vast potential. The good news is that getting ahead of the game now will put you one step ahead of your competitors. Optimizing your website for this new search medium will give your SEO a huge boost and provide a more enjoyable customer journey and user experience. Most importantly, your sales and revenue could potentially skyrocket.

What Is Voice Search? 

Voice search lets users speak their queries instead of typing them. Users can search by opening their browser, clicking on the microphone icon to the right of the search bar, and saying whatever phrase they would otherwise type. These search terms can be as simple as “coffee shops” or as specific as “coffee shops in northwest Portland open Monday at 7 AM.”

Because this feature is available on both desktop and mobile, voice search is quickly growing in popularity.

Here are three best practices for optimizing your keyword strategy for voice search.

Consider How People Talk When Developing Your Content

Very few of us talk the same way we write, and we should expect to see that difference reflected in the way our target audience looks for information via voice search.  

For example, while you might type in “Los Angeles restaurants” and then proceed to narrow down your search, you might voice search “What are some good restaurants near the Los Angeles Arts District?” Again, individuals who voice search are using this method because it’s convenient and probably don’t want to spend the additional time scrolling and narrowing down their search. 

You can optimize your content for voice search by including the following (a markup sketch appears after this list):

  • Questions: Incorporate “Who, What, Where, When, and Why” language into your content. Think of the information your target audience is looking for and how they’ll voice that question when speaking instead of typing. This practice will also drastically improve your chance of earning a featured snippet, which we’ll cover in the next section. 
  • Location: Try to be as narrow and descriptive as possible when it comes to describing your location. Make sure to mention the neighborhood you’re in and nearby attractions in your content (whenever it makes sense to do so; don’t force it). 
  • Related Products or Services: Is there a related product or service that your audience is constantly searching for? If so, you probably want to mention them in your content to ensure they can find you more easily. For example, marketing automation users often want to know if our platform integrates with the CRM they are using, so we make sure to include phrases such as “Does Act-On integrate with Microsoft Dynamics?” or “What are the benefits of integrating SugarCRM with Act-On?” throughout our website.
  • Competitive Advantages: The last thing you want is for your target customer to voice search “What is the best [insert product or service here]?” and not see your company in the results. To prevent this, spell out what makes you stand out from the competition and feature plenty of positive customer reviews throughout your website.
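
One concrete way to surface question-and-answer content to search engines is FAQ structured data. Below is a minimal sketch that builds schema.org FAQPage JSON-LD with Python; the schema.org types are real, but the questions, answers, and output are placeholder assumptions for illustration, not Act-On’s actual markup.

```python
import json

# Placeholder Q&A pairs; swap in questions your audience actually asks.
faqs = [
    ("Does Act-On integrate with Microsoft Dynamics?",
     "Yes; see our integrations page for setup details."),
    ("What are the benefits of integrating SugarCRM with Act-On?",
     "Integration keeps lead scores and contact records in sync."),
]

# Build schema.org FAQPage JSON-LD, the markup Google documents for FAQ rich results.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# Emit a <script> tag you can paste into the page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(faq_schema, indent=2))
print("</script>")
```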

Focus on Earning Featured Snippets 

You’ve probably scrolled through your search results and noticed that, many times, there’s a featured preview with the answer to the question you were looking for. And, as a marketer, you might be wondering how this company managed to get a bit more than just their website name and meta description featured in search results. 

These highlighted posts are called featured snippets, and companies can earn them by offering what Google deems a thorough answer to commonly asked questions. As we previously mentioned, the introduction of voice search means that more individuals will be searching for more specific questions instead of generic terms. So you can kill two birds with one stone by structuring your content to answer frequently asked questions — you’ll improve your ranking in search results and increase your chances of earning that featured snippet spot.

Securing a featured snippet can help you do more than get your content front and center in search results. Better search positioning means better web traffic and more customers. And since securing this spot establishes your company as a thought leader, a featured snippet can also help you build trust and credibility with your audience early on in the sales process — providing added leverage over the competition. 

If you’d like to learn more about how to earn a featured snippet on Google, check out this blog post we wrote on the topic not too long ago. 

Sticking to SEO Basics Can Go a Long Way

You have to learn how to walk before you can run, and that same concept applies if you want to optimize your website for voice search. In other words, the two tactics we mentioned will go further if you have a solid SEO foundation to start with — and that means having good keywords, meta descriptions, and little-to-no errors. And by errors, I don’t mean just looking out for typos; you should ensure that your page is indexed, the meta description is the correct length, you are using the correct headings (H1, H2) with the proper keywords, and that your page doesn’t take too long to load.

If you’re completely new to SEO, it can be a very difficult concept to grasp — and you might not know where to start to make sure you’re on the right track. Thankfully, there’s a wide array of tools available to ensure your webpages are optimized. Act-On’s SEO audit tool, for example, checks your web and landing pages to verify that you’re following best practices with effective keywords and zero errors.

Guide Customers Through a Targeted and Effective Customer Journey With Act-On

As any good marketer knows, SEO is only one part of the lead generation and growth marketing process. There’s so much more to creating an awesome holistic strategy!

If you’re ready to learn about the digital tools you can use to enhance your marketing strategy, please schedule a demo with one of our marketing automation experts. 

If you’re not quite there yet, please download the eBook below to learn how to optimize your funnel!

Act-On Blog

The growth of cognitive search in the enterprise, and why it matters

December 15, 2019   Big Data

Enterprises typically have countless data buckets to wrangle (upwards of 93% say they’re storing data in more than one place), and some of those buckets invariably become underused or forgotten. A Forrester survey found that between 60% and 73% of all data within corporations is never analyzed for insights or larger trends, while a separate Veritas report found that 52% of all information stored by organizations is of unknown value. The opportunity cost of this unused data is substantial — the Veritas report pegs it as a cumulative $3.3 trillion by the year 2020, if the current trend holds.

That’s perhaps why this year saw renewed interest from the corporate sector in AI-powered software-as-a-service (SaaS) products that ingest, understand, organize, and query digital content from multiple sources. “Keyword-based enterprise search engines of the past are obsolete. Cognitive search is the new generation of enterprise search that uses [AI] to return results that are more relevant to the user or embedded in an application issuing the search query,” wrote Forrester analysts Mike Gualtieri, Srividya Sridharan, and Emily Miller in a comprehensive survey of the industry published in 2017.

Emerging products

Microsoft kicked the segment into overdrive in early November by launching Project Cortex, a service that taps AI to automatically classify and analyze an organization’s documents, conversations, meetings, and videos. It’s in some ways a direct response to Google Cloud Search, which launched July 2018. Like Project Cortex, Cloud Search pulls in data from a range of third-party products and services running both on-premises and in the cloud, relying on machine learning to deliver query suggestions and surface the most relevant results. Not to be outdone, Amazon last week unveiled AWS Kendra, which taps a library of connectors to unify data sources, including file systems, websites, Box, DropBox, Salesforce, SharePoint, relational databases, and more.

Of course, Google, Amazon, and Microsoft aren’t the only cognitive search vendors on the block. There’s IBM, which offers a data indexing and query processing service dubbed Watson Explorer, and Coveo, which uses AI to learn users’ behaviors and return results that are most relevant to them. Hewlett-Packard Enterprise’s IDOL platform supports analytics for speech, images, and video, in addition to unstructured text. And both Lucidworks and Squirro leverage open source projects like Apache Solr and Elasticsearch to make sense of disparate data sets.

The cognitive search market is exploding — it’s anticipated to be worth $15.28 billion by 2023, up from $2.59 billion in 2018, according to Markets and Markets — and it coincides with an upswing in the adoption of AI and machine learning in the enterprise. But it’s perhaps more directly attributable to the wealth of telemetry afforded by modern corporate digital environments.

AI under the hood

AI models like those at the heart of AWS Kendra, Project Cortex, and Cloud Search learn from signals, or behavioral data derived from various inputs. These come from the web pages that employees visit or the videos they watch online, or their online chats with support agents and public databases of support tickets. That’s not to mention detailed information about users, including job titles, locations, departments, coworkers, and potentially all of the documents, emails, and other correspondences they author.

Each signal informs an AI system’s decision-making such that it self-improves practically continuously, automatically learning how various resources are relevant to each person and ranking those resources accordingly. Plus, because enterprises have far fewer data sources to contend with than, say, a web search engine, the models are less expensive and computationally time-consuming to train.

The other piece of the puzzle is natural language processing (NLP), which enables platforms like AWS Kendra to understand not only document minutiae but also the search queries that employees across an organization might pose — like “How do I invest in our company’s 401k?” versus “What are the best options for my 401k plan?”

Not every platform is equally capable in this regard, but most incorporate emerging techniques in NLP, as well as the adjacent field of natural language search (NLS). NLS is a specialized application of AI and statistical reasoning that creates a “word mesh” from free-flowing text, akin to a knowledge graph, to connect similar concepts that are related to larger ideas. NLS systems understand context in this way, meaning they’ll return the same answer regardless of how a query is phrased and will take users to the exact spot in a record where that answer is likely to be found.
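
As an illustration of phrasing-invariant matching, here is a small sketch (a toy, not any vendor’s implementation) that uses the open source sentence-transformers library to embed a query and a handful of passages, then returns the closest match by cosine similarity. The model name is a real public checkpoint, but the passages are invented placeholders.

```python
from sentence_transformers import SentenceTransformer, util

# A small public embedding model; downloaded on first use.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Toy knowledge base: passages an enterprise index might hold (placeholders).
passages = [
    "Employees may contribute up to 10% of salary to the company 401k plan.",
    "The cafeteria is open weekdays from 7 AM to 3 PM.",
    "VPN access requires enrolling your device with the IT service desk.",
]
passage_vecs = model.encode(passages, convert_to_tensor=True)

def best_passage(query):
    """Return the passage most semantically similar to the query."""
    query_vec = model.encode(query, convert_to_tensor=True)
    scores = util.cos_sim(query_vec, passage_vecs)[0]
    return passages[int(scores.argmax())]

# Two differently phrased queries should land on the same passage.
print(best_passage("How do I invest in our company's 401k?"))
print(best_passage("What are the best options for my 401k plan?"))
```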

Cognitive search: the new normal

In short order, cognitive search stands to become table stakes in the enterprise. It’s estimated that 54% of knowledge workers are already interrupted a few times or more per month when trying to get access to answers, insights, and information. And the volume of unstructured data organizations produce is projected to increase in the years to come, exacerbating the findability problem.

“Productivity isn’t just about being more efficient. It’s also about aggregating and applying the collective knowledge of your organization so that together you can achieve more,” wrote Microsoft 365 corporate vice president Jared Spataro in a recent blog post. “[Cognitive search systems enable] business process efficiency by turning your content into an interactive knowledge repository … to analyze documents and extract metadata to create sophisticated content models … [and to] make it easy for people to access the valuable knowledge that’s so often locked away in documents, conversations, meetings, and videos.”

Big Data – VentureBeat

A super-fast machine learning model for finding user search intent

November 30, 2019   Big Data

In April 2019, Benjamin Burkholder (who is awesome, by the way) published a Medium article showing off a script he wrote that uses SERP result features to infer a user’s search intent. The script uses the SerpAPI.com API for its data and labels search queries in the following way:

  • Informational — The person is looking for more information on a topic. This is indicated by whether an answer box or PAA (people also ask) boxes are present.
  • Navigational — The person is searching for a specific website. This is indicated by whether a knowledge graph is present or if site links are present.
  • Transactional — The person is aiming to purchase something. This is indicated by whether shopping ads are present.
  • Commercial Investigation — The person is aiming to make a purchase soon but is still researching. This is indicated by whether paid ads are present, an answer box is present, PAAs are present, or if there are ads present at the bottom of the SERP.

This is one of the coolest ways to estimate search intent, because it uses Google’s understanding of search intent (as expressed by the SERP features shown for that search).
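
To make that mapping concrete, here is a minimal, hypothetical rules-based classifier in the same spirit. The dictionary keys are my own stand-ins for features parsed out of a SerpAPI response, not Burkholder’s actual variable names.

```python
def classify_intent(serp):
    """Label a query by which SERP features appear.

    `serp` is assumed to be a dict of booleans extracted from a SerpAPI
    response, e.g. {"answer_box": True, "paa": False, ...}. A query can
    signal more than one intent, so a list is returned.
    """
    intents = []
    if serp.get("answer_box") or serp.get("paa"):
        intents.append("Informational")
    if serp.get("knowledge_graph") or serp.get("site_links"):
        intents.append("Navigational")
    if serp.get("shopping_ads"):
        intents.append("Transactional")
    if (serp.get("paid_ads") or serp.get("answer_box")
            or serp.get("paa") or serp.get("bottom_ads")):
        intents.append("Commercial Investigation")
    return intents or ["Unknown"]

# Example: a SERP showing an answer box and paid ads at the top.
print(classify_intent({"answer_box": True, "paid_ads": True}))
# -> ['Informational', 'Commercial Investigation']
```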

The one problem with Burkholder’s approach is its reliance on the SerpAPI service. If you have a large set of search queries you want to find intent for, you must pass each query phrase through the API, which performs the actual search and returns the SERP features that Burkholder’s script then classifies. On a large set of search queries, this is time-consuming and prohibitively expensive.

SerpAPI charges roughly $0.01 per keyword, so analyzing 5,000 keywords will cost you about $50. Running the results through Burkholder’s labeler script also takes 3 to 5 hours for those 5,000 keywords.

So I got to thinking: What if I adapted Burkholder’s approach so that, rather than use it to classify intent directly, I could use it to train a machine learning model that I would then use to classify intent? In other words, I’d incur one-time costs to produce my Burkholder-labeled training set, and, assuming it was accurate enough, I could then use that training set for all further classification, cost free.

With an accurate training set, anyone could label huge numbers of keywords super quickly, without spending a dime.

Finding a model

Hamlet Batista has written a few stellar posts about how to leverage Natural Language models like BERT for labeling intent.

In his posts, he uses an existing intent labeling model that returns categories from Kaggle’s Question Answering Dataset. While these labels can be useful, they are not really intent categories in the sense of a typical intent taxonomy; instead they use labels such as Description, Entity, Human, Numeric, and Location.

He achieved excellent results by training a BERT encoder, getting near 90% accuracy in predicting labels for new/unlabeled search keywords.

The big question for me was, could I leverage the same tech (Uber’s Ludwig BERT encoder) to create an accurate model using the search intent labels I’d get from Burkholder’s code?

It turns out the answer is yes!

How to do it

Here’s how the process works:

1. Gather your list of keywords. If you’re planning on training your own model, I recommend doing so within a specific category/niche. Training on clothing-related keywords and then using that model to label finance-related keywords will likely be significantly less accurate than training and labeling within the same clothing niche. That said, I did try using a model trained on one category/niche to label another, and the results still seemed quite good to me.

2. Run Burkholder’s script over your list of keywords from Step 1. This will require signing up for SerpAPI.com and buying credits. I recommend getting labels for at least 10,000 search queries with this script to use for training. The more training data, the more accurate your model will likely be.

3. Use the labeled data from the previous step as your training data for the BERT model. Batista’s code to do this is very straightforward, and this article will guide you through the process (a configuration sketch follows these steps). I was able to get roughly 72% accuracy using about 10,000 labels of training data.

4. Use your model from Step 3 to label unlabeled search data, and then take a look at your results!
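
For Steps 3 and 4, the training and prediction setup might look roughly like this. It is a hedged sketch against a recent release of Uber’s open source Ludwig library; the config keys vary between Ludwig versions, and the column and file names are assumptions, not Batista’s exact code.

```python
from ludwig.api import LudwigModel

# Declarative model definition: a BERT text encoder feeding a category output.
# "keyword" and "intent" are assumed column names in the labeled CSV from Step 2.
config = {
    "input_features": [
        {"name": "keyword", "type": "text", "encoder": {"type": "bert"}},
    ],
    "output_features": [
        {"name": "intent", "type": "category"},
    ],
}

model = LudwigModel(config)

# Train on the Burkholder-labeled keywords (hypothetical file name).
train_stats, _, _ = model.train(dataset="labeled_keywords.csv")

# Step 4: predict intent for keywords the rules-based labeler never saw.
predictions, _ = model.predict(dataset="unlabeled_keywords.csv")
print(predictions.head())
```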

The results

I ran through this process using a huge list (13,000 keywords) of clothing/fashion-related search terms from SEMrush as my training data. My resulting model gets just about 80% accuracy.

It seems likely that training the model with more data will continue to improve its accuracy up to a point. If any of you attempt it and improve on 80% accuracy, I would love to hear about it. I think with 20,000+ labeled searches, we could see up to maybe 85-90% accuracy.

This means that when you ask this model to predict the intent of unlabeled search queries, 8 times out of 10 it will give you the same label that Burkholder’s SerpAPI rules-based classifier would have returned. It can also do this for free, in large volumes, and incredibly fast.

So something that would have taken a few thousand dollars and days of scraping can now be done for free in just minutes.

In my case I used keywords from a related domain (makeup) instead of clothing keywords, and overall the model did a pretty good job: labeling 5,000 search queries took under two minutes with the BERT model.

The implications

For SEO tools to be useful, they need to be scalable. Keyword research, content strategy, PPC strategy, and SEO strategy usually rely on being able to do analysis across entire niches/themes/topics/websites.

In many industries, the keyword long tail can extend into millions of terms. So a faster, more affordable version of Burkholder’s approach can make a lot of difference.

I foresee AI and machine learning tools being used more and more in our industry, enabling SEOs, paid search specialists, and content marketers to gain superpowers that weren’t possible before these new AI breakthroughs.

Happy analyzing!

Kristin Tynski is a founder and the SVP of Creative at Fractl, a boutique growth agency based in Delray Beach, FL.

Big Data – VentureBeat

Search using Query fails accessing ChemSpider with Mathematica

November 2, 2019   BI News and Info

Performing a search with “Formula” works as expected, but a search with “Query” fails. A printout is attached to better document the problem.

Recent Questions – Mathematica Stack Exchange

Why smart search is the cornerstone of digital transformation (VB Live)

October 21, 2019   Big Data

Presented by Lucidworks

Employees have a full spectrum of content and data, but it’s easy to get lost in unproductive, dead-end hunts. Join this VB Live event to learn how AI-powered smart search boosts efficient data discovery and insights that deliver real-world, high-value solutions for complex problems of every size.

Register here for free.


Digital transformation means pivoting to become more efficient, data-driven, and nimble. Traditional enterprise search is anything but.

To do their jobs, enterprise employees need to tap into a huge amount of content and data available both inside and outside the company, and the tools they’re handed aren’t up to the job, says Simon Taylor, vice president worldwide channels & alliances at Lucidworks.

“The volume of data in those silos has created a form of disconnection, both from the employee to the information and the employee to the application, which ultimately drives down productivity,” Taylor says. “Also, if you have too much information, you get information overload.”

A lot of employees today are disengaged at work because they can’t get to the information that makes their role or purpose more relevant, he explains. Their search results aren’t relevant enough to their specific role, or surfaced to them in relevant contexts, and that’s why employees, particularly millennials, get frustrated in their workplace, he adds. They’re very used to tools they use externally, whether it’s social media or browsing or searching.

“The workplace is having to transform the way it does things, to provide the same type of demand level tools, relevancy level tools, insight-driven tools, to make employees more productive,” Taylor explains.

He warns that if you don’t have the right solutions in place, or even the right insight tools to drive productivity from employees, you’ll make bad decisions, and the company as a whole is going to spend more money on operationalizing things that fundamentally should be a lot easier — and that’s where you get a strong business case for search.

Two types of AI-powered search differentiate businesses today. In curated search you give the AI framework direction about what you need to learn, and then ask it to surface insight that’s relevant to your personal journey in order to make more informed decisions about how to use that information.

On the flip side of that is deep learning, which auto-curates the most relevant topics and classifications and categories, or the ontology, from the data, and then presents it as a corpus of information with recommendations about how to best use it.

In a business context, both curated and uncurated search are essential and can drive value in three areas. The first is making access to information more efficient and delivering contextualized, relevant results quickly — for instance, use cases like HR portals.

The second is automating answers: with natural language processing and machine learning, knowledge management tools can serve responses through an automated communication platform like a chatbot. This improves efficiency and saves money, too, by reducing the amount of operational resources required to answer rudimentary questions.

The third is finding opportunities to make money: the data discovery area, which helps employees find ways to do things more quickly and actually drive more revenue.

He points to an oil and gas company that paid for the same research three times in a 10-year period because its enterprise content management (ECM) system couldn’t find and pull up the records of the previous studies. The research was meant to determine the viability of a new drilling site, and each time it reached the same conclusion: the area being explored would not be a lucrative drill site. The time and money spent rediscovering that answer were simply lost.

AI-powered search broke down the company’s information silos so that they were able to access research that had been done across the business, and better decide where to focus the company’s assets and resources in order to get the most value.

“That business case, for them, represented multi-billion-dollar savings and revenue generation,” Taylor says. “It’s the real power of search, when you can deliver some significant business-changing direction.”

Transitioning to an AI-powered search tool isn’t really optional these days, he adds — your competitors are doing it to get to information in the most efficient way possible.

“It’s not a resource thing, where you can throw more people at it,” he says. “You have to become clever in the way that you apply technology and think smarter about the way you use it.”

To learn more about how search and machine learning can deliver both savings and revenue opportunities, a look into some of the most compelling business use cases, and more, don’t miss this VB Live event!


Don’t miss out!

Register here for free.


You’ll learn:

  • What operationalized AI means
  • How search and machine learning align to drive efficiency and Opex (operational expenditure) savings
  • How search and machine learning can create revenue opportunities
  • Success factors for operationalized AI and top lessons learned

Speakers:

  • Simon Taylor, Vice President Worldwide Channels & Alliances, Lucidworks
  • JP Sherman, Enterprise Search & Findability Expert, Red Hat

Big Data – VentureBeat

What Is the Google Search Console?

September 20, 2019   CRM News and Info

Crawl Errors & Stats

As a search engine discovers pages on your site, it can encounter issues as it tries to gain access. In the Crawl Errors report, you’ll be notified of issues such as Server Errors, Soft 404s, and 404 Not Found pages for both mobile and desktop. These indicate the following problems on your site (a short detection sketch follows the list):

  • Server Errors are typically caused by hosting issues: if your hosting provider goes down, the crawler can’t reach your pages.
  • Soft 404s occur when a URL no longer exists on your site but your server returns a success code instead of a 404, so the missing page looks like a live one to the crawler.
  • 404 Not Found notifications indicate a page that no longer exists; the user sees a 404 page, and the crawler records that the page is gone.
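
To see the difference between a hard 404 and a soft 404 in practice, here is a minimal sketch using Python’s requests library. The URLs are placeholders, and the soft-404 heuristic (a success code on a page whose body reads like an error page) is a simplification of what Google’s crawler actually does.

```python
import requests

def check_url(url):
    """Classify a URL as OK, a hard 404, or a likely soft 404."""
    resp = requests.get(url, timeout=10)
    if resp.status_code == 404:
        return "hard 404"  # server correctly reports the page is gone
    if resp.status_code == 200:
        # Heuristic: a 200 status on a page whose body reads like an
        # error page is the classic soft-404 signature.
        body = resp.text.lower()
        if "page not found" in body or "no longer exists" in body:
            return "likely soft 404"
        return "ok"
    return "status {}".format(resp.status_code)

# Placeholder URLs; replace with pages from your own site.
for url in ["https://example.com/", "https://example.com/old-page"]:
    print(url, "->", check_url(url))
```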

Crawl Stats indicate the rate at which a crawler is finding pages: pages crawled per day (high and low), the time it takes to download a page, and total kilobytes downloaded. It’s important to analyze these stats over time for better insight into sudden spikes in crawl activity or in the time it takes to crawl your site, both of which could indicate issues worth investigating in more detail. After all, if search engines have trouble crawling your site, they could well miss important information your prospective customers need to see.

Sitemaps

Every website should have a sitemap. If yours does not, I highly recommend building one today. A sitemap shows a search engine all of the pages on your site in a format that is simplified and easy for a bot to read and understand.  

In Google Search Console, web property owners can submit their sitemaps and monitor the number of pages submitted and subsequent pages Google indexes. Any errors encountered with the sitemap are shown in this section. Take note of large differences between the number of pages submitted and the number of pages indexed. A large difference may indicate an issue and should be investigated.
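
If your site doesn’t have a sitemap yet, generating a minimal one is straightforward. The sketch below builds the standard sitemaps.org XML using only Python’s standard library; the page URLs and output file name are placeholders.

```python
import xml.etree.ElementTree as ET

# Placeholder page URLs; in practice you'd pull these from your CMS or router.
pages = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/voice-search-seo",
]

# Root element in the sitemaps.org namespace that search engines expect.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write sitemap.xml, then submit it in Google Search Console under Sitemaps.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```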

Security Issues & Other Resources

Spam is prevalent online, and the security of your website could be at risk if it isn’t managed correctly. Google takes precautions and will notify you if it discovers any indication of spam or security threats on your site. If malware is detected, Google Search Console flags it in the Security Issues section. Be sure to monitor this area regularly and check out the other resources available, too.

Optimizing Your Website Takes Time and Effort

Search Console is an amazing resource for anyone managing or overseeing marketing activities related to a website. Without this useful tool, you’re missing out on in-depth information to help you improve the amount of traffic you get from organic search and, as a result, failing to capture great leads that help drive ROI. 

Optimizing your website, however, is only part of the equation. If you want to learn how you can do more with your website to engage visitors and keep them moving through the sales funnel, download our eBook, Personalizing the Web Experience: The Key to Better Customer Interaction and Engagement on Your Website (also linked below). In this useful guide, we’ll walk you through what you can do to engage and inspire visitors to convert once they’re actually on your site.

Act-On Blog
