Google details how it’s using AI and machine learning to improve search

October 16, 2020   Big Data

During a livestreamed event this afternoon, Google detailed the ways it’s applying AI and machine learning to improve the Google Search experience.

Soon, Google says, users will be able to see how busy places are in Google Maps without having to search for specific beaches, parks, grocery stores, gas stations, laundromats, pharmacies, or other businesses, an expansion of Google’s existing busyness metrics. The company also says it’s adding COVID-19 safety information to business profiles across Search and Maps, revealing whether businesses are taking precautions such as temperature checks and plexiglass barriers.

An algorithmic improvement to “Did you mean,” Google’s spell-checking feature for Search, will enable more accurate and precise spelling suggestions. Google says the new underlying language model contains 680 million parameters — the variables that determine each prediction — and runs in less than three milliseconds. “This single change makes a greater improvement to spelling than all of our improvements over the last five years,” Prabhakar Raghavan, head of Search at Google, said in a blog post.

Beyond this, Google says it can now index individual passages from webpages as opposed to whole pages. When this rolls out fully, it will improve roughly 7% of search queries across all languages, the company claims. A complementary AI component will help Search capture the nuances of what webpages are about, ostensibly leading to a wider range of results for search queries.
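
To make the passage-indexing idea concrete, here is a minimal, hypothetical sketch (not Google’s implementation): every page is split into passages, each passage is scored against the query independently, and the best-scoring passage determines the result. The pages, passages, and TF-IDF scoring below are invented for illustration.

```python
# Hypothetical sketch of passage-level retrieval: rather than scoring a whole
# page against a query, each page is split into passages and the best-matching
# passage wins. This is an illustration of the concept, not Google's system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

pages = {
    "diy-home.html": [
        "Our guide covers garden landscaping and patio design.",
        "You can check if your home windows use UV-resistant glass "
        "by looking at the faint tint near the edge of the pane.",
    ],
    "glass-types.html": [
        "Glass manufacturing has a long industrial history.",
        "Tempered glass is used in car windows for safety.",
    ],
}

query = "how to tell if windows are uv glass"

# Flatten to (page, passage) pairs so every passage is indexed individually.
passage_index = [(page, p) for page, passages in pages.items() for p in passages]
vectorizer = TfidfVectorizer().fit([p for _, p in passage_index] + [query])
passage_vecs = vectorizer.transform([p for _, p in passage_index])
query_vec = vectorizer.transform([query])

# Rank passages, not pages, against the query.
scores = cosine_similarity(query_vec, passage_vecs)[0]
best_score, (best_page, best_passage) = max(zip(scores, passage_index), key=lambda x: x[0])
print(f"Best passage ({best_page}): {best_passage}  score={best_score:.2f}")
```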

“We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad,” Raghavan continued. “As an example, if you search for ‘home exercise equipment,’ we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page.”
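
As a purely illustrative stand-in for those neural subtopic models, the sketch below clusters candidate results with off-the-shelf TF-IDF and k-means, then interleaves one result per cluster so a broad query surfaces several subtopics on the first page. The results and cluster count are made up for the example.

```python
# Illustrative result diversification by subtopic (a simple stand-in, not
# Google's neural approach): cluster candidate results, then interleave one
# result per cluster so budget, premium, and small-space picks all appear early.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Candidate results for the broad query "home exercise equipment".
results = [
    "best budget home exercise equipment under $100",
    "cheap resistance bands and dumbbells for beginners",
    "premium smart home gym machines reviewed",
    "luxury treadmills and connected rowers compared",
    "compact workout gear for small apartments",
    "fold-away equipment ideas for tiny spaces",
]

vectors = TfidfVectorizer().fit_transform(results)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

by_subtopic = {}
for text, label in zip(results, labels):
    by_subtopic.setdefault(label, []).append(text)

# Interleave one result per subtopic so the top of the page spans all of them.
diversified = []
while any(by_subtopic.values()):
    for texts in by_subtopic.values():
        if texts:
            diversified.append(texts.pop(0))

print(diversified)
```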

Google is also bringing Data Commons, its open knowledge repository that combines data from public datasets (e.g., COVID-19 stats from the U.S. Centers for Disease Control and Prevention) using mapped common entities, to search results on the web and mobile. In the near future, users will be able to search for topics like “employment in Chicago” on Search to see information in context.
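
The “mapped common entities” idea can be pictured as a join key: different public datasets refer to the same place through one canonical entity ID, so a query like “employment in Chicago” can pull figures from several datasets at once. The toy sketch below uses invented IDs and numbers purely for illustration.

```python
# Toy illustration of joining public datasets through a shared canonical
# entity, in the spirit of Data Commons. All IDs and figures are made up.
covid_cases_by_place = {"entity/Chicago_IL": 310_000}   # e.g., a CDC-style dataset
employment_by_place = {"entity/Chicago_IL": 1_250_000}  # e.g., a BLS-style dataset

# Free-text place names are mapped onto the shared canonical entity.
name_to_entity = {"chicago": "entity/Chicago_IL", "chicago, il": "entity/Chicago_IL"}

def lookup(place_name: str) -> dict:
    """Resolve a place name to its entity, then read across datasets."""
    entity = name_to_entity[place_name.lower()]
    return {
        "entity": entity,
        "covid_cases": covid_cases_by_place.get(entity),
        "employment": employment_by_place.get(entity),
    }

print(lookup("Chicago"))  # one entity, data combined from multiple datasets
```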

On the ecommerce and shopping front, Google says it has built cloud streaming technology that enables users to see products in augmented reality (AR). With cars from Volvo, Porsche, and other “top” auto brands, for example, shoppers can view a vehicle in their driveway, to scale, on their smartphones and zoom in on details like the steering wheel. Separately, Google Lens on the Google app or Chrome on Android (and soon iOS) will let shoppers discover similar products by tapping on elements like vintage denim, ruffle sleeves, and more.

[Image: Augmented reality previews in Google Search. Image Credit: Google]

In another addition to Search, Google says it will deploy a feature that highlights notable points in videos — for example, a screenshot comparing different products or a key step in a recipe. (Google expects 10% of searches will use this technology by the end of 2020.) And Live View in Maps, a tool that taps AR to provide turn-by-turn walking directions, will enable users to quickly see information about restaurants including how busy they tend to get and their star ratings.

Lastly, Google says it will let users search for songs by simply humming or whistling melodies, initially in English on iOS and in more than 20 languages on Android. You will be able to launch the feature by opening the latest version of the Google app or the Search widget, tapping the mic icon, and saying “What’s this song?” or selecting the “Search a song” button, then humming or whistling for at least 10 to 15 seconds.

“After you’re finished humming, our machine learning algorithm helps identify potential song matches,” Google wrote in a blog post. “We’ll show you the most likely options based on the tune. Then you can select the best match and explore information on the song and artist, view any accompanying music videos or listen to the song on your favorite music app, find the lyrics, read analysis and even check out other recordings of the song when available.”

Google says that melodies hummed into Search are transformed by machine learning algorithms into a number-based sequence representing the song’s melody. The models are trained to identify songs from a variety of sources, including humans singing, whistling, or humming, as well as studio recordings. They also strip away other details, like accompanying instruments and the voice’s timbre and tone. This leaves a fingerprint that Google compares with thousands of songs from around the world to identify potential matches in real time, much like the Pixel’s Now Playing feature.
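
A rough, hypothetical sketch of that kind of matching is shown below: absolute pitches are reduced to key-invariant interval steps (the “fingerprint”), and the hummed fingerprint is compared against a small library. The pitch values and distance metric are simplified stand-ins, not Google’s algorithm.

```python
# Hypothetical melody-matching sketch: a hummed tune becomes a number-based,
# key-invariant sequence of pitch intervals, which is compared against
# fingerprints of known songs. Simplified for illustration only.
import numpy as np

def fingerprint(pitches):
    """Convert absolute pitches (e.g., MIDI note numbers) to interval steps,
    so the fingerprint does not depend on which key the user hums in."""
    return np.diff(np.asarray(pitches, dtype=float))

song_library = {
    "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],
    "Ode to Joy":      [64, 64, 65, 67, 67, 65, 64],
}

hummed = [62, 62, 69, 69, 71, 71, 69]  # same melody as "Twinkle Twinkle", hummed higher

hum_fp = fingerprint(hummed)
scores = {
    title: -np.abs(fingerprint(notes) - hum_fp).sum()  # higher means closer
    for title, notes in song_library.items()
}
print(max(scores, key=scores.get))  # -> "Twinkle Twinkle"
```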

“From new technologies to new opportunities, I’m really excited about the future of search and all of the ways that it can help us make sense of the world,” Raghavan said.

Last month, Google announced it will begin showing quick facts related to photos in Google Images, enabled by AI. Starting in the U.S. in English, users who search for images on mobile might see information from Google’s Knowledge Graph — Google’s database of billions of facts — including people, places, or things germane to specific pictures.

Google also recently revealed it’s using AI and machine learning techniques to more quickly detect breaking news around crises like natural disasters. In a related development, Google said it launched an update using language models to improve the matching between news stories and available fact checks.

In 2019, Google peeled back the curtains on its efforts to solve query ambiguities with a technique called Bidirectional Encoder Representations from Transformers, or BERT for short. BERT, which emerged from the tech giant’s research on Transformers, forces models to consider the context of a word by looking at the words that come before and after it. According to Google, BERT helped Google Search better understand 10% of queries in the U.S. in English — particularly longer, more conversational searches where prepositions like “for” and “to” matter a lot to the meaning.
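
The effect of that bidirectional context can be seen with the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint (not Google’s production Search models): the same word receives a different vector depending on the words around it.

```python
# Minimal sketch of BERT's context-dependent representations using the public
# bert-base-uncased checkpoint: "bank" gets different vectors in different
# sentences, reflecting the words before and after it.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

river = word_vector("she sat on the bank of the river", "bank")
money = word_vector("he deposited cash at the bank", "bank")
other_river = word_vector("fish swam near the bank of the stream", "bank")

cos = torch.nn.functional.cosine_similarity
print(cos(river, other_river, dim=0))  # higher: similar contexts
print(cos(river, money, dim=0))        # lower: different sense of "bank"
```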

BERT is now used in every English search, Google says, and it’s deployed across languages including Spanish, Portuguese, Hindi, Arabic, and German.

Big Data – VentureBeat
