
Tag Archives: using

Upcoming Webinar: Using Dynamics 365 to Empower your Marketing and Sales Teams with Digital Automation

January 22, 2021   Microsoft Dynamics CRM

See first-hand how your sales and marketing teams can automate their manual processes, which take up valuable time and energy, using tools in Dynamics 365 like lead scoring and automated email campaigns.

When: Tuesday, February 16th at 11 AM CST

Register Now

In this webinar you’ll learn how to:

  • Target the leads most likely to close with behavior-based, targeted email marketing
  • Use automatic lead scoring models that grade leads based on their interactions with your marketing content
  • Get your questions answered by Dynamics 365 Marketing consultants in a live Q&A

What is Marketing Automation?

Marketing automation tools take marketing processes that are traditionally done manually and make them easier and more targeted. For example, instead of guessing how interested your leads are, you can create scoring models that rate them based on their previous interactions with your website and marketing content.

In Dynamics 365, you can use marketing automation to automate processes like sending an email to a contact who clicked a link in a previous email, or upgrading a lead’s score when they visit your website.
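
To make the scoring idea concrete, here is a minimal sketch of a behavior-based lead scoring model in Python. It is purely illustrative: the event names, point values, and grade thresholds are assumptions, not Dynamics 365’s built-in scoring rules.

    # Illustrative only: a simple behavior-based lead scoring model.
    # Point values and grade thresholds are hypothetical, not Dynamics 365 defaults.

    SCORING_RULES = {
        "email_open": 5,
        "email_link_click": 15,
        "website_visit": 10,
        "pricing_page_visit": 25,
        "webinar_registration": 30,
    }

    def score_lead(interactions):
        """Sum the points earned for each recorded interaction."""
        return sum(SCORING_RULES.get(event, 0) for event in interactions)

    def grade(score):
        """Translate a raw score into a sales-ready grade."""
        if score >= 60:
            return "A"   # hand off to sales
        if score >= 30:
            return "B"   # keep nurturing
        return "C"       # low engagement

    lead = ["email_open", "email_link_click", "pricing_page_visit", "website_visit"]
    print(score_lead(lead), grade(score_lead(lead)))   # 55 B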

Learn more about Dynamics 365 Marketing


CRM Software Blog | Dynamics 365

Read More

Researchers propose using the game Overcooked to benchmark collaborative AI systems

January 15, 2021   Big Data


Deep reinforcement learning systems are among the most capable in AI, particularly in the robotics domain. However, in the real world, these systems encounter a number of situations and behaviors to which they weren’t exposed during development.

In a step toward systems that can collaborate with humans in order to help them accomplish their goals, researchers at Microsoft, the University of California, Berkeley, and the University of Nottingham developed a methodology for applying a testing paradigm to human-AI collaboration that can be demonstrated in a simplified version of the game Overcooked. Players in Overcooked control a number of chefs in kitchens filled with obstacles and hazards to prepare meals to order under a time limit.

The team asserts that Overcooked, while not necessarily designed with robustness benchmarking in mind, can successfully test potential edge cases in states a system should be able to handle, as well as the partners the system should be able to play with. For example, in Overcooked, systems must contend with scenarios like when plates are accidentally left on counters and when a partner stays put for a while because they’re thinking or away from their keyboard.


Above: Screen captures from the researchers’ test environment.

The researchers investigated a number of techniques for improving system robustness, including training a system with a diverse population of other collaborative systems. Over the course of experiments in Overcooked, they observed whether several test systems could recognize when to get out of the way (like when a partner was carrying an ingredient) and when to pick up and deliver orders after a partner has been idling for a while.
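
The excerpt doesn’t include the researchers’ actual test suite, but the unit-test idea can be sketched: hand-craft a few edge-case start states and check what the agent does in each. Everything below (the state fields, the stand-in agent, the pass criteria) is hypothetical and only illustrates the pattern.

    # Hypothetical sketch of edge-case "unit tests" for a collaborative agent.
    # The state fields, agent interface, and pass criteria are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class KitchenState:
        plate_left_on_counter: bool
        partner_idle_steps: int
        order_ready: bool

    def dummy_agent(state):
        """Stand-in policy: deliver if an order is ready and the partner is idle."""
        if state.order_ready and state.partner_idle_steps > 5:
            return "deliver_order"
        if state.plate_left_on_counter:
            return "pick_up_plate"
        return "wait"

    EDGE_CASES = [
        (KitchenState(plate_left_on_counter=True, partner_idle_steps=0, order_ready=False),
         "pick_up_plate"),
        (KitchenState(plate_left_on_counter=False, partner_idle_steps=10, order_ready=True),
         "deliver_order"),
    ]

    def run_suite(agent):
        """Fraction of hand-crafted edge cases where the agent takes the expected action."""
        passed = sum(agent(state) == expected for state, expected in EDGE_CASES)
        return passed / len(EDGE_CASES)

    print(f"robustness score: {run_suite(dummy_agent):.0%}")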

According to the researchers, current deep reinforcement learning agents aren’t very robust — at least not as measured by Overcooked. None of the systems they tested scored above 65% in the video game, suggesting, the researchers say, that Overcooked can serve as a useful human-AI collaboration metric in the future.


“We emphasize that our primary finding is that our [Overcooked] test suite provides information that may not be available by simply considering validation reward, and our conclusions for specific techniques are more preliminary,” the researchers wrote in a paper describing their work. “A natural extension of our work is to expand the use of unit tests to other domains besides human-AI collaboration … An alternative direction for future work is to explore meta learning, in order to train the agent to adapt online to the specific human partner it is playing with. This could lead to significant gains, especially on agent robustness with memory.”



Big Data – VentureBeat

Read More

How to Show/Hide a Button Using the Business Process Flow Stage

January 7, 2021   Microsoft Dynamics CRM

In today’s blog, we’ll discuss how to show and hide a button using the Business Process Flow (BPF) stage. In the example, we are going to hide the Close as Won button for all BPF stages on the Opportunity entity except for the last stage, Opportunity Close, where we will show the Close as Won button. First, we will create one new record in the Opportunity entity. After creating the new Opportunity…

Source


PowerObjects- Bringing Focus to Dynamics CRM

Read More

Let Your Data Tell a Story: CRM Using Microsoft Power BI

December 16, 2020   CRM News and Info

CRM is the key to a successful business. Data visualization takes your customer relationships to another level. Extend your CRM solutions to create a bigger, more sustainable customer pool with Microsoft Power BI. This software application visualizes your corporate information and allows your data to tell a very important story.

The information age has digitized most businesses today, which means your corporate data says more about a company than any top executive. The Microsoft Dynamics 365 suite of applications allows you to maximize your customer relations using your data!

If you’re interested in maintaining excellent customer relationships, read more. JourneyTEAM has given you tips and tricks on how your data can tell a story with Microsoft Power BI.

The Principles of Storytelling

Storytelling has been the way humans have communicated since the stone age. Now, it’s time for your data to tell a story. Below are the principles for a good story:

  • Quick and Simple: Present long, complex data in a short and easy-to-understand fashion so the audience can make the most out of what you show.
  • Inspire Change: The data you present should connect with the audience so that they’re inspired to change their work/life in a way that benefits your business (CRM 101)!
  • Make an Impact: Ensure that the storytelling leaves a mark on the audience. An unforgettable experience is what creates recurring customers.


Elements of a Good Story

For your data to tell a story effectively, it has to abide by four key elements: stage setting, target audience, planning, and allowing deeper explorations.

Stage Setting

Good CRM starts with context. Stage setting gives your audience a context on the data they’re viewing. Without your audience knowing what they’re seeing, your story loses all meaning.

The best way to set the stage for your data visualization is the dashboard. Here are some tips to an excellent Power BI dashboard:

  • Organize the dashboard effectively and maintain good spacing
  • Use clear headings and titles for all graphics and data
  • Maintain concise and impactful labels
  • Use few colors to reduce distractions, and highlight only key metrics

Target Audience

A different audience means a different story. It is important to tailor your visualizations based on who your audience is, where they come from, what data they care about, etc.

A good practice is to create different pages for different audiences. You can also provide page links on the dashboard. Know your audience well, and show only the data they care about. This will maximize the story’s impact.

Plan

To ensure you give enough attention to all aspects of your story, plan what you’re going to present beforehand.

A great way to do this is through storyboards. With storyboards, you can create a board for each page, deciding what colors and information that page will have. You can also manage links and any other aspect of the page.

Ensure that you focus only on the important stories and data. Create separate pages to organize your data well. Don’t clutter a page! Presenting too much information at once ironically leaves your audience with less information.

Explorations

Allow your audience to explore your visualizations. Create pages, links, and drill-throughs that engage the audience and demand their attention.

  • Use filters to allow users to find the data and reports they need. Drill-throughs are useful so that customers can learn more about specific pieces of information!
  • Make the dashboard the home for the most important stories. It should be an executive summary that says a little bit about everything. Then use drill-throughs to allow users to dig into the specifics.

The Perfect Dashboard

The dashboard is the heart of your visualization. The better the dashboard, the better the response.

  • Personalize the dashboard based on your audience. Make it informative yet simple, and elaborative yet quick!
  • A dashboard needs to say a little about everything. Ensure you display all the critical stories on the dashboard and use drill-throughs to allow users to dig deeper.
  • Put the important information on the top, and the most critical information on the top left. It’s where the eyes go first!
  • Use the right charts to represent your data, maintain color consistency, and highlight only key metrics!

Note: Power BI offers several variations of graphs and charts. Make sure you choose the ones accurate for the type of information you present!

Making the Most of Visualization

70% of your mental faculty is used in visualization. Most of that is iconic memory, things that you see and instantly forget.

When your audience views your visualizations, you must make the most out of cognitive science to ensure the information goes from iconic memory to long-term memory.

What to avoid:

  • Too many bright colors
  • Thick, colorful borders
  • Lack of highlighting
  • Excessive grids
  • Needlessly precise numbers

How to capture your story in the long-term memory:

  • Create call-to-actions
  • Use data patterns that are easy to view and process (see the Gestalt principles)
  • Use textual or audio cues alongside the visualization to engage multiple sensory inputs
  • Maintain color consistency
  • Maintain good spacing
  • Round off numbers that have many digits

Click here to see the full article

JourneyTEAM was recently awarded Microsoft US Partner of the Year for Dynamics 365 Customer Engagement (Media & Communications) and the Microsoft Eagle Crystal trophy as a top 5 partner for Dynamics 365 Business Central software implementations. Our team has a proven track record with successful Microsoft technology implementations. Consolidating your communication, collaboration, and productivity platform with Microsoft will provide the best support possible for your sales, marketing, and other teams, wherever they work. Let us show you how. Contact JourneyTEAM today!


Article by: Dave Bollard – Chief Marketing Officer | 801-436-6636

JourneyTEAM is an award-winning consulting firm with proven technology and measurable results. They take Microsoft products (Dynamics 365, SharePoint intranet, Office 365, Azure, CRM, GP, NAV, SL, AX) and modify them to work for you. The team has expert-level, Microsoft Gold-certified consultants who dive deep into the dynamics of your organization and solve complex issues. They have solutions for sales, marketing, productivity, collaboration, analytics, accounting, security and more. www.journeyteam.com


CRM Software Blog | Dynamics 365

Read More

New York City Council votes to prohibit businesses from using facial recognition without public notice

December 11, 2020   Big Data


New York City Council today passed a privacy law for commercial establishments that prohibits retailers and other businesses from using facial recognition or other biometric tracking without public notice. If signed into law by NYC Mayor Bill de Blasio, the bill would also prohibit businesses from selling biometric data to third parties.

In the wake of the Black Lives Matter movement, an increasing number of cities and states have expressed concerns about facial recognition technology and its applications. Oakland and San Francisco, California and Somerville, Massachusetts are among the metros where law enforcement is prohibited from using facial recognition. In Illinois, companies must get consent before collecting biometric information of any kind, including face images. New York recently passed a moratorium on the use of biometric identification in schools until 2022, and lawmakers in Massachusetts have advanced a suspension of government use of any biometric surveillance system within the commonwealth. More recently, Portland, Maine approved a ballot initiative banning the use of facial recognition by police and city agencies.

The New York City Council bill, which was sponsored by Bronx Councilman Ritchie Torres, doesn’t outright ban the use of facial recognition technologies by businesses. However, it does impose restrictions on the ways brick-and-mortar locations like retailers, which might use facial recognition to prevent theft or personalize certain services, can deploy it. Businesses that fail to post a warning about collecting biometric data must pay $500. Businesses found selling data will face fines of $5,000.

In this aspect, the bill falls short of Portland, Oregon’s recently-passed ordinance regarding biometric data collection, which bans all private use of biometric data in places of “public accommodation,” including stores, banks, restaurants, public transit stations, homeless shelters, doctors’ offices, rental properties, retirement homes, and a variety of other types of businesses (excepting workplaces). It’s scheduled to take effect starting January 1, 2021.

“I commend the City Council for protecting New Yorkers from facial recognition and other biometric tracking. No one should have to risk being profiled by a racist algorithm just for buying milk at the neighborhood store,” Fox Cahn, executive director of the Surveillance Technology Oversight Project, said. “While this is just a first step towards comprehensively banning biometric surveillance, it’s a crucial one. We shouldn’t allow giant companies to sell our biometric data simply because we want to buy necessities. Far too many companies use biometric surveillance systems to profile customers of color, even though they are biased. If companies don’t comply with the new law, we have a simple message: ‘we’ll see you in court.’”

Numerous studies and VentureBeat’s own analyses of public benchmark data have shown facial recognition algorithms are susceptible to bias. One issue is that the data sets used to train the algorithms skew white and male. IBM found that 81% of people in the three face-image collections most widely cited in academic studies have lighter-colored skin. Academics have found that photographic technology and techniques can also favor lighter skin, including everything from sepia-tinged film to low-contrast digital cameras.

“Given the current lack of regulation and oversight of biometric identifier information, we must do all we can as a city to protect New Yorkers’ privacy and information,” said Councilman Andrew Cohen, who chairs the Committee on Consumer Affairs. Crain’s New York reports that the committee voted unanimously in favor of advancing Torres’ bill to the full council hearing earlier this afternoon.

The algorithms are often misused in the field, as well, which tends to amplify their underlying biases. A report from Georgetown Law’s Center on Privacy and Technology details how police feed facial recognition software flawed data, including composite sketches and pictures of celebrities who share physical features with suspects. The New York Police Department and others reportedly edit photos with blur effects and 3D modelers to make them more conducive to algorithmic face searches. And police in Minnesota have been using biometric technology from vendors including Cognitec since 2018, despite a denial issued that year, according to the Star Tribune.

Amazon, IBM, and Microsoft have self-imposed moratoriums on the sale of facial recognition systems. But some vendors, like Rank One Computing and Los Angeles-based TrueFace, are aiming to fill the gap with customers, including the City of Detroit and the U.S. Air Force.


Big Data – VentureBeat

Read More

Using Connection References with Power Automate and Common Data Service

November 14, 2020   Microsoft Dynamics CRM

If you have not heard of Connection References yet, they are definitely worth looking into. There are a lot of great blogs out there already, such as our blog introducing the feature, as well as our Docs information.

At a high level, connection references are solution-aware components that contain a reference associating the connector used with the flow it resides in. When importing a solution that contains a flow from one environment to another, this prevents you from needing to open the flow and re-establish connections. In this case, we will focus on the Common Data Service (current environment) connection reference.

There are a few important things I want to call out.

Connection references are specific to a user and a connection when they are automatically created or automatically used within a flow. Connections within a flow can be manually updated to use a connection reference tied to another user’s connection, if one exists.

They can also be created manually from the Power Apps maker portal; these, too, are specific to a user’s connection.

From within a solution, click New and select Connection Reference (preview).

[Screenshot: creating a new connection reference from within a solution]

You will give it a name, and I recommend making it unique, since that is what currently displays in the views. You will also have to select a connector and an existing connection for that connector type; if one doesn’t exist, you must create it. (Note: a connection for that connector must also exist for this user in the destination org.)

[Screenshot: new connection reference form with name, connector, and connection]

That is pretty straightforward.

How is a connection reference automatically created? In this case, a user does not have an existing connection reference in an environment where they want to build a flow. If this user creates a flow in this environment using the Common Data Service (Current environment) trigger or action, a connection reference will automatically be created.

When it is created automatically, it will look like below, using a default name:

[Screenshot: automatically created connection reference with the default name]

The only way you can currently find the schema name of this connection reference is to click the ellipsis in this view and click Edit. Then you will see the following:

[Screenshot: connection reference edit pane showing the schema name]

Now, suppose another person builds flows that use Common Data Service (current environment) connectors in that same environment and also does not have an existing connection reference. The same thing will happen for that user.

This means that if a user does not already own a connection reference for Common Data Service (current environment) and they create a new flow using a trigger or action for that connector, another connection reference will be created for that user with the exact same display name.

[Screenshot: a second connection reference created with the same display name]

However, the schema name is different.

[Screenshot: the second connection reference’s differing schema name]

If you extend this scenario to 10-15 people, or more, building flows in an environment, you will end up with multiple connection references that appear to be the same. This can become confusing very quickly, especially during import of the solution.
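
One practical way to untangle identically named connection references is to list display names next to schema names programmatically. The sketch below queries the Dataverse Web API from Python; the connectionreferences entity set and column names are my assumption based on the standard table, and the org URL and token are placeholders, so verify the names against your environment’s metadata.

    # Sketch: list connection reference display names vs. schema names via the Dataverse Web API.
    # Assumes the standard "connectionreferences" entity set and column names; verify in your org.
    import requests

    ORG_URL = "https://yourorg.crm.dynamics.com"   # placeholder organization URL
    TOKEN = "<oauth-access-token>"                 # placeholder; acquire via Azure AD

    url = (f"{ORG_URL}/api/data/v9.1/connectionreferences"
           "?$select=connectionreferencedisplayname,connectionreferencelogicalname")
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}",
                                      "Accept": "application/json"})
    resp.raise_for_status()

    # Print display name alongside schema (logical) name to spot duplicates.
    for ref in resp.json()["value"]:
        print(ref["connectionreferencedisplayname"], "->", ref["connectionreferencelogicalname"])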

If a user already has an existing connection reference in an environment, using Common Data Service (current environment) and they create a new flow, it will automatically set that connection reference to the connector in the flow and will not create a new connection reference.

As a best practice, and to prevent having multiple connection references using the same name and causing confusion, each user can create a connection reference manually with a preferred naming convention before creating any flows.

If you are already in this situation, users can change the display name of existing connection references or create a new connection reference with the preferred name and reassign this to the existing flows.

To update the connection reference once you have a new one created, open the flow in the originating org and update the connections to use the new connection reference. This will update the destination org when imported.

[Screenshot: updating a flow connection to use the new connection reference]

There is a known issue right now: if you create a new solution with a flow, or add a flow to an existing solution, the connection reference is not added to the solution. It must be added manually. Be aware of this, especially if that connection reference doesn’t exist in the destination environment; the import will fail. If you encounter this and have multiple connection references using the default name of Common Data Service (current environment), you will have to find the schema name that aligns with the missing connection reference and add it to the solution in the originating environment.

While connection references are in preview, one connection reference can only be used within a maximum of 16 flows. If the same connection needs to be used in more than 16 flows, then create another connection reference with a connection to the same connector. 

What happens when you hit this limit? You will see a message like this when trying to save the flow (click image to expand):

[Screenshot: error message shown when the flow limit for a connection reference is reached]

What happens when User A creates a solution with a flow in one environment, using User A’s connection and connection reference, but User B imports this solution into a destination environment where User A does not have an existing connection?

When importing the solution, User B will have the option to select their own existing connection to tie to the connection reference, or to create a new connection. They could select an existing connection, as shown below,

[Screenshot: selecting an existing connection during solution import]

or click + New Connection

[Screenshot: creating a new connection during solution import]

Otherwise, User A would need to sign in to the flow portal for the destination environment and create a connection. This would allow the user importing the solution to select the correct connection for the connection reference.

[Screenshot: connection reference mapping during solution import]

Thanks for reading!

Aaron Richards

Sr. Customer Engineer

Dynamics 365 and Power Platform


Dynamics 365 Customer Engagement in the Field

Read More

Datathon Winner: Best Analytics Visualization Using TIBCO Spotfire

November 11, 2020   TIBCO Spotfire

Reading Time: 5 minutes

Oil and gas and energy professionals came together this summer for an amazing, data-driven event. Untapped Energy and the Society of Petroleum Engineers (SPE) organized and hosted the DUC Datathon (a.k.a. the Drilled but Uncompleted Wells Datathon). Bringing together hundreds of students and energy professionals, the event gave participants access to virtual sessions, data bootcamps, and lively competitions.

TIBCO was excited to sponsor the event and provided free access to TIBCO Spotfire®, an advanced analytics tool that is very popular in the global energy sector. Our very own Michael O’Connell, TIBCO Chief Analytics Officer, kicked off the competition with a fantastic keynote.  

We even had a competition of our own! Worthy competitors went head to head to create the “Best Visualization Using TIBCO Spotfire.” There was no shortage of creativity and talent, but in the end only one champion could be crowned for their outstanding efforts: Dean Udsen. Udsen is an accomplished data and analytics consultant in the energy space. Over the last decade, he has been widely recognized for his efforts with TIBCO customers.

Udsen decided to join the Datathon to stay busy, motivated, and continue his learning this summer. And his enthusiasm paid off, winning him a pair of Bose 700 headphones. There’s a lot we can learn from Udsen’s success. Below we’ll take a look at a few examples of how he helps TIBCO Spotfire customers in Western Canada create real business value.

Providing Value to Customers 

Udsen’s success is based on his core best practices: It’s essential to have a deep understanding of your clients’ goals and ensure you’re working towards a common objective. It’s also important to work with the client team at their comfort level. Some organizations are experienced and ready to move quickly. Others take time to ramp up their capabilities and develop trust in the analytics.

Udsen believes that integrating data is the “secret sauce” that helps all organizations identify ways to improve their efficiency and effectiveness. By providing advice and experience on how to pull the data together, he helps clients quickly build dashboards that provide the business with real value.

Today, there is often an impulse to either build everything all at once or to build all the backend components to support all potential processes before an end-user sees anything. But, according to Udsen, it’s better to start with one or two business processes and build them from beginning to end. This gives end-users something to start working with, shows them the potential, and allows the team to learn as they go. Plus, you gain immediate value as clients use the new process quickly to help make business decisions.

It’s important to not employ technology for the sake of technology. In a very price sensitive environment, you need to implement tactical projects that provide an immediate benefit. First, understand what is the most important issue for the client, and then look for the quickest way to improve.

Now, let’s look at a few customer examples from Dean Udsen’s past projects:

  • Radical Improvements in Time to Acquisition Analysis: One client wanted to improve their acquisition process and significantly decrease the amount of time needed to perform analysis. How accurate were their forecasts for price, production volumes, revenue, etc. in comparison to the reality of what happened in the ensuing years? They built the dashboard below using Spotfire to combine data from public data sources, internal accounting, and forecasting software. It allows the team to quickly and easily compare the forecasted vs. actual data of any acquisition.
  [Dashboard screenshot: forecast vs. actual acquisition analysis]
  • An Evolution in Weekly Production Reporting to Immersive, Real-Time Reporting: In the past, another client of Udsen’s met each week to compare weekly production using printed hard-copy reports for each well. Now, with a new dashboard built using Spotfire (pictured below), the team has an automated way to review the data. The team can select the report week, group wells by their pad location, and investigate individual wells in real time. They can easily compare wells and pads to each other, and the amount of time required to prepare for the meeting is now zero. The data is available on demand without having to wait for the reports.
  [Dashboard screenshot: automated weekly production reporting]
  • Faster Well Counting, from Weeks to Hours: Another client previously used a quarterly report process to review industry activity and predict how many wells would be drilled on their royalty lands. This process took weeks to complete each quarter. The dashboard below, again built using Spotfire, speeds up this process greatly, allowing the team to get updates every day and see up-to-the-minute activity and expected well counts as they occur. It combines data from a number of sources, including public data, land, and GIS data, to automatically determine which wells are “on the lands” and whether the expected drilling matches the company’s mineral rights.
  [Dashboard screenshot: daily well counts on royalty lands]
  • Deeper Insights into Well Operations: Engineers at one company needed a quick way to combine data between their production, accounting, and drilling applications. A quick look at each well was needed to ensure optimal performance and that costs were being contained. Working with the engineers, Udsen developed a number of Spotfire charts to meet this need, looking at the physical operations of the wells along with production and sales revenues. See the example below:
  [Dashboard screenshot: well operations overview]

Get started with TIBCO Spotfire 

These are just a few examples of the successful use cases saving Spotfire users time and money. In fact, according to one customer:

“Spotfire enables us to bring together several disparate data sets, both internal and external to the company.  We can then visualize relationships and develop insights about our business, without spending most of our time collecting, curating, modifying, and combining data.  The results are easily shared, and often generate new ideas that lead to even further leverage of our data, and impact the strategic activities within the business.” 


Interested in implementing any of the use cases mentioned above or others? Implement TIBCO Spotfire and enable everyone to visualize new discoveries in your data, quickly and easily. 


The TIBCO Blog

Read More

Google details how it’s using AI and machine learning to improve search

October 16, 2020   Big Data


During a livestreamed event this afternoon, Google detailed the ways it’s applying AI and machine learning to improve the Google Search experience.

Soon, Google says, users will be able to see how busy places are in Google Maps without having to search for specific beaches, parks, grocery stores, gas stations, laundromats, pharmacies, or other businesses, an expansion of Google’s existing busyness metrics. The company also says it’s adding COVID-19 safety information to business profiles across Search and Maps, revealing whether they’re using safety precautions like temperature checks, plexiglass, and more.

An algorithmic improvement to “Did you mean,” Google’s spell-checking feature for Search, will enable more accurate and precise spelling suggestions. Google says the new underlying language model contains 680 million parameters — the variables that determine each prediction — and runs in less than three milliseconds. “This single change makes a greater improvement to spelling than all of our improvements over the last five years,” Prabhakar Raghavan, head of Search at Google, said in a blog post.

Beyond this, Google says it can now index individual passages from webpages as opposed to whole pages. When this rolls out fully, it will improve roughly 7% of search queries across all languages, the company claims. A complementary AI component will help Search capture the nuances of what webpages are about, ostensibly leading to a wider range of results for search queries.

“We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad,” Raghavan continued. “As an example, if you search for ‘home exercise equipment,’ we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page.”

Google is also bringing Data Commons, its open knowledge repository that combines data from public datasets (e.g., COVID-19 stats from the U.S. Centers for Disease Control and Prevention) using mapped common entities, to search results on the web and mobile. In the near future, users will be able to search for topics like “employment in Chicago” on Search to see information in context.

On the ecommerce and shopping front, Google says it has built cloud streaming technology that enables users to see products in augmented reality (AR). With cars from Volvo, Porsche, and other “top” auto brands, for example, they can zoom in to view the steering wheel and other details in a driveway, to scale, on their smartphones. Separately, Google Lens on the Google app or Chrome on Android (and soon iOS) will let shoppers discover similar products by tapping on elements like vintage denim, ruffle sleeves, and more.


Above: Augmented reality previews in Google Search.

Image Credit: Google

In another addition to Search, Google says it will deploy a feature that highlights notable points in videos — for example, a screenshot comparing different products or a key step in a recipe. (Google expects 10% of searches will use this technology by the end of 2020.) And Live View in Maps, a tool that taps AR to provide turn-by-turn walking directions, will enable users to quickly see information about restaurants including how busy they tend to get and their star ratings.

Lastly, Google says it will let users search for songs by simply humming or whistling melodies, initially in English on iOS and in more than 20 languages on Android. You will be able to launch the feature by opening the latest version of the Google app or Search widget, tapping the mic icon, and saying “What’s this song?” or selecting the “Search a song” button, followed by at least 10 to 15 seconds of humming or whistling.

“After you’re finished humming, our machine learning algorithm helps identify potential song matches,” Google wrote in a blog post. “We’ll show you the most likely options based on the tune. Then you can select the best match and explore information on the song and artist, view any accompanying music videos or listen to the song on your favorite music app, find the lyrics, read analysis and even check out other recordings of the song when available.”

Google says that melodies hummed into Search are transformed by machine learning algorithms into a number-based sequence representing the song’s melody. The models are trained to identify songs based on a variety of sources, including humans singing, whistling, or humming, as well as studio recordings. They also strip away all the other details, like accompanying instruments and the voice’s timbre and tone. This leaves a fingerprint that Google compares with thousands of songs from around the world to identify potential matches in real time, much like the Pixel’s Now Playing feature.
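
Google hasn’t published the exact representation, but the general idea of a key-invariant melody fingerprint can be sketched: reduce the hummed pitch sequence to the intervals between successive notes, then compare against a library with a simple distance. The pitches, song library, and matching rule below are invented purely for illustration.

    # Illustrative sketch of a key-invariant melody "fingerprint": intervals between notes.
    # Pitches are MIDI note numbers; the song library and distance rule are made up for the example.

    def fingerprint(pitches):
        """Intervals between successive notes; transposing the melody leaves this unchanged."""
        return [b - a for a, b in zip(pitches, pitches[1:])]

    def distance(fp_a, fp_b):
        """Mean absolute difference over the overlapping portion of two fingerprints."""
        n = min(len(fp_a), len(fp_b))
        return sum(abs(x - y) for x, y in zip(fp_a[:n], fp_b[:n])) / n

    LIBRARY = {
        "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],
        "Ode to Joy":      [64, 64, 65, 67, 67, 65, 64, 62],
    }

    # The same tune as "Twinkle", hummed two semitones higher; the fingerprint still matches.
    hummed = [62, 62, 69, 69, 71, 71, 69]
    hum_fp = fingerprint(hummed)
    best = min(LIBRARY, key=lambda name: distance(hum_fp, fingerprint(LIBRARY[name])))
    print("best match:", best)   # best match: Twinkle Twinkle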

“From new technologies to new opportunities, I’m really excited about the future of search and all of the ways that it can help us make sense of the world,” Raghavan said.

Last month, Google announced it will begin showing quick facts related to photos in Google Images, enabled by AI. Starting in the U.S. in English, users who search for images on mobile might see information from Google’s Knowledge Graph — Google’s database of billions of facts — including people, places, or things germane to specific pictures.

Google also recently revealed it’s using AI and machine learning techniques to more quickly detect breaking news around crises like natural disasters. In a related development, Google said it launched an update using language models to improve the matching between news stories and available fact checks.

In 2019, Google peeled back the curtains on its efforts to solve query ambiguities with a technique called Bidirectional Encoder Representations from Transformers, or BERT for short. BERT, which emerged from the tech giant’s research on Transformers, forces models to consider the context of a word by looking at the words that come before and after it. According to Google, BERT helped Google Search better understand 10% of queries in the U.S. in English — particularly longer, more conversational searches where prepositions like “for” and “to” matter a lot to the meaning.

BERT is now used in every English search, Google says, and it’s deployed across languages including Spanish, Portuguese, Hindi, Arabic, and German.
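
As a small illustration of what considering context means in practice, the sketch below uses the open-source Hugging Face transformers library (an assumption for the example, not Google’s production Search stack) to show that BERT assigns the same word different vectors in different sentences.

    # Illustration with the open-source transformers library (not Google's Search deployment):
    # the word "bank" gets different contextual vectors in different sentences.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    def embedding_of(word, sentence):
        """Return the contextual vector BERT produces for `word` inside `sentence`."""
        inputs = tokenizer(sentence, return_tensors="pt")
        tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
        idx = tokens.index(word)
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state[0]
        return hidden[idx]

    river = embedding_of("bank", "we sat on the bank of the river")
    money = embedding_of("bank", "she deposited cash at the bank")
    # The cosine similarity is noticeably below 1.0: same word, different context, different vector.
    print(torch.cosine_similarity(river, money, dim=0).item())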


Big Data – VentureBeat

Read More

1 Click to Undo and Restore recent and past Dynamics 365 CRM record(s) changes using Click2Undo

October 3, 2020   Microsoft Dynamics CRM


When working with data, it is crucial to maintain the correctness of the data. Correct data leads to reliable and effective decision making that helps drive sales. Dynamics 365 CRM is a data-driven platform, so users cannot afford to make mistakes. But mistakes are inevitable when handling a large amount of data every day, and when it comes to your business data, misplaced data or data loss can bring an entire operation to an abrupt halt.

Suppose you are updating some details of an open opportunity record in your Dynamics 365 CRM. Before you realize it, you have accidentally updated the amount and the change has been saved by CRM. You can’t remember what the actual amount was, and CRM doesn’t allow you to undo changes made to a record once they are saved. This simple human error could cost the company a lot. But not if you have Click2Undo, the newest addition to our suite of productivity apps, 1 Click Apps, installed in your Dynamics 365 CRM.

1 Click App Click2Undo:

Click2Undo is a robust Microsoft Preferred App that enables undo functionality in Dynamics 365 CRM and Power Apps. It helps users restore records to their last known state with just a single click. Users can undo the last change made to a single record or to multiple records in one go. Changes made in the past can also be undone without any hassle. With the help of Click2Undo, users can also restore deleted records in just one click.

Features of Click2Undo:

Undo and restore the last changes made to a record – Click2Undo lets users undo the last changes made to a record and restore it to its former state with just a single click on the Click2Undo button. Changes that are auto-saved can also be undone by this feature, reducing the risk of losing data.

Undo and restore changes made in the past – Dynamics 365 CRM, with the help of audit logs, captures all the updates made to a record. Using a user-friendly interface, Click2Undo allows users to review those updates and select and restore the values of individual fields on the selected record. From recent changes to changes made in the previous month, a single click on the Undo button restores them all.

Undo changes in multiple records – With this feature, changes made to multiple Dynamics 365 CRM records can be undone in one go. This bulk restore capability saves users time and increases their productivity.
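
For a sense of the general pattern behind this kind of field-level undo (snapshots of previous values that can be reapplied), here is a small conceptual sketch. It is not Click2Undo’s implementation and does not touch Dynamics 365; it only illustrates restoring a record to its last known state from a change history.

    # Conceptual sketch of field-level undo via a change history.
    # Not Click2Undo's implementation; it just illustrates the restore-from-history idea.
    from collections import defaultdict

    class ChangeHistory:
        def __init__(self):
            self._history = defaultdict(list)   # record id -> list of previous snapshots

        def record_change(self, record_id, record, updates):
            """Save the fields about to change, then apply the update."""
            snapshot = {field: record[field] for field in updates if field in record}
            self._history[record_id].append(snapshot)
            record.update(updates)

        def undo_last(self, record_id, record):
            """Restore the most recent snapshot for this record, if any."""
            if self._history[record_id]:
                record.update(self._history[record_id].pop())

    opportunity = {"name": "Big Deal", "amount": 50000}
    history = ChangeHistory()
    history.record_change("opp-1", opportunity, {"amount": 5000})   # accidental edit
    history.undo_last("opp-1", opportunity)
    print(opportunity["amount"])   # 50000 again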

Imagine misplacing an important client’s contact information, only to find out later that the delay resulted in losing that client to a competitor. After working so hard, you don’t want a silly mistake to set you back. Hence, Click2Undo is a must-have solution for your Dynamics 365 CRM environment.

While Click2Undo makes sure that the unwanted changes are to be undone with just a click, there is one requirement that needs to be fulfilled: it has to be installed in your Dynamics 365 CRM before you have a problem. After all, Click2Undo is all about preparedness and staying ahead of the potential for things to go wrong.

So, what are you waiting for? Click2Undo is a must-have productivity app for your Dynamics 365 CRM / PowerApps environment that reduces the risk of data loss and incorrect data, indirectly helping you maintain customer relationships and make informed decisions. Download the solution for a free 15-day trial from our website or Microsoft AppSource.

To know more about our other Microsoft Preferred Dynamics 365 CRM / PowerApps solutions, click here.

For more information on Click2Undo, refer to our online help manual or drop us an email at crm@inogic.com and our solution experts will be happy to assist you.

And…there is no undo button in real life, so stay safe!


CRM Software Blog | Dynamics 365

Read More

Post-COVID selling using D365

September 24, 2020   CRM News and Info

In all of its communications, Microsoft hammers home the message about how Dynamics enables a 360-degree view of your customers — and it’s true.

With D365 Sales, D365 Customer Insights, D365 Marketing, and D365 Finance — as well as connectors to platforms like LinkedIn and extensions to automate customer touchpoints ranging from social posts to sales quotes — Dynamics can deliver enough customer data to make it feel like you’re right there in the room with each and every one of them.

What Microsoft doesn’t trumpet quite as loudly is how Dynamics also empowers department leaders and managers with a 360-degree view of their teams (and not just in Teams). It’s this strength that will serve sales teams best as they navigate selling in a socially distanced, post-COVID era.

Simple foundations to build on as we work to improve sales management (and thereby sales) in the coming year include the technology stack a company shares enterprise-wide and the way that stack can be extended at the individual level.

We’re going to make what we hope is a safe assumption: everyone in your organization is using Dynamics 365, Office 365, and every essential tool they require via the cloud. (If you’re not already running your business on Azure, it’s well past time.) It’s absolutely OK to BYOD (bring your own device), but it’s not OK to bring your own platform.

Yes, we know that some companies make exceptions for departments such as accounting and finance — citing security concerns and the like. But with D365’s multifactor authentication and ID management solutions, these concerns are as dated as file cabinets and rolodexes.

With everyone on the same Azure platform, your primary deliverable (other than providing leadership) is ensuring constant access to LOB (line of business) applications for every member on your team.

A common excuse (in the pre-COVID era, let alone now) for slips in remote productivity is to blame the tech: it was slow, it was outdated, it “didn’t work.” But with everyone integrated into the same technology stack, a manager can remove those variables from the productivity discussion, and eliminate that excuse entirely.

Most importantly, with everyone on the team enjoying the same level of access and functionality, a manager can measure productivity far more easily. While “old-fashioned” tools like funnels can get part of the job done, it’s when you roll in active monitoring and reporting solutions (e.g., via Power BI integrations) that a manager enjoys a near-omniscient level of oversight, no matter where they or their employees may be.

Want to truly empower your remote staff? Let them extend their Dynamics instance as they see fit.

For example, if you’re a manager, there are dozens of workflow monitoring and management tools you can add to Dynamics that make spotting and addressing clogs in the pipeline a task you can handle on your phone.

If you’re a sales representative, you absolutely need a way to extend D365 to automate your sales proposal processes. A configure, price, quote (CPQ) tool can help a rep send out 5x as many proposals as a manual process. Not only that, but as the manager of that rep, you’ll get analytics on the progress of each proposal in the pipeline (here’s a link to one of the best CPQ solutions for Dynamics).

The key thing in keeping a remote workforce happy and effective is empowering them to work as they see fit. For sales reps, the job is mostly about one thing: closing. By extending Dynamics with CPQ, they can enjoy higher closing rates regardless, and that’ll give them something to smile about in these trying times, whether they’re on the road or in the (home) office.


CRM Software Blog | Dynamics 365

Read More