Tag Archives: Voice

Google Assistant gets deeper app integrations as voice assistant usage skyrockets

October 8, 2020   Big Data


Starting today, Google Assistant can quickly open, search, and interact with some of the most popular Android apps on the Google Play Store. To kick off its Google Assistant Developer Day conference, Google this morning announced new Assistant shortcuts across fitness, social media, payment, ride-hailing, and other categories of apps for actions like finding a ride, ordering food, and playing music. Beyond this, it detailed improvements headed to Assistant-powered smart displays like the Nest Hub.

The pandemic appears to have supercharged voice app usage, which was already on an upswing. According to a study by NPR and Edison Research, the percentage of voice-enabled device owners who use commands at least once a day rose between the beginning of 2020 and the start of April. Just over a third of smart speaker owners say they listen to more music, entertainment, and news from their devices than they did before, and owners report requesting an average of 10.8 tasks per week from their assistant this year compared with 9.4 different tasks in 2019. And according to a new report from Juniper Research, consumers will interact with voice assistants on 8.4 billion devices by 2024.

Expanded App Actions

The Assistant integrations were made possible in part by App Actions, a developer service that creates deep links between Android smartphone apps and Assistant. App Actions, which Google showcased for the first time at its I/O developer conference in 2018, initially launched in four categories — health and fitness, finance and banking, ride-sharing, and food ordering — when it entered public preview last year. (It now supports 10 verticals in total including social, games, travel and local, productivity, and shopping and communications.) App Actions complement App Slices, which were introduced in 2019 to serve up content and data from apps. And both look to spark reengagement among users without forcing developers to build separate experiences for Assistant.

App Actions behave like shortcuts to parts of Android apps. They build on top of existing functionality in apps, and the development process is similar for each App Action developers choose to implement. Basically, App Actions take users to content within apps via deep link URLs, which developers specify. By transferring intents and commands from Assistant to an app, App Actions enable users to do things like order Dunkin’ Donuts, buy stock with Etrade, and send money with PayPal. As for App Slices, they let users ask things like “How many miles did I run today?” and receive responses from apps such as Nike Run Club without leaving Assistant.

Among the over 30 new apps supported are Nike Adapt, Nike Run Club, Spotify, Postmates, MyFitnessPal, Mint, Discord, Walmart, Etsy, Snapchat, Twitter, Citi, Dunkin, PayPal, Wayfair, Wish, Uber, and Yahoo! Mail. Assistant now recognizes and acts on commands like:

  • “Hey Google, send a message to Rachel on Discord”
  • “Hey Google, search for candles on Etsy”
  • “Hey Google, log a berry smoothie on MyFitnessPal”
  • “Hey Google, check my accounts on Mint”
  • “Hey Google, tighten my shoes with Nike Adapt”
  • “Hey Google, start my run with Nike Run Club”
  • “Hey Google, order a smoothie on Postmates”
  • “Hey Google, send a snap with Cartoon Lens”
  • “Hey Google, find Motivation Mix on Spotify”
  • “Hey Google, check news on Twitter”
  • “Hey Google, when is my Walmart order arriving?”

In a related change, Google says that Assistant will begin showing relevant App Actions even when users don’t mention an app explicitly by name. For example, if they say “Hey Google, show me Taylor Swift,” Assistant might highlight a suggestion chip that will guide them to the search results page in Twitter. Assistant will also suggest apps proactively depending on individual usage patterns.


Alongside these integrations and recommendations, Google is introducing the ability to create custom shortcut phrases for specific tasks, a feature first exposed in March by the code sleuths at 9to5Google. Instead of saying “Hey Google, tighten my shoes with Nike Adapt,” users can create a shortcut like “Hey Google, lace it.” Alternatively, they can explore suggested shortcuts by saying “Hey Google, show my shortcuts.”

Google says that all phones running Android 5 and higher should support the new app integrations and shortcuts. (Assistant on Android Go doesn’t support App Actions.) Additional apps and expanded device support are expected to arrive at a future date. “Whether you want a faster way to get into your apps, or create custom shortcuts for your most common tasks, we’re excited to keep making Android and your apps even more useful and convenient, giving you time back to enjoy what matters most,” Assistant product director Baris Gultekin wrote in a blog post.

Coincidentally, Amazon months ago launched its answer to App Actions in Alexa for Apps, which integrates iOS and Android apps’ content and functionality with Alexa. Through deep linking, developers can assign tasks like opening an app’s home page or rendering search results to Alexa app voice commands.

New Smart Display features

Germane to the smart display side of things, Google announced two new English voices that take advantage of an improved prosody model to make Assistant sound more natural and fluent. They’re now available, and developers can leverage them in existing Actions.

In addition to the new voices, Google is expanding Interactive Canvas, an API that lets developers build Assistant experiences that can be controlled via touch and voice, to education and storytelling verticals. Soon, education and storytelling intents will be open for public registration, enabling users to say things like “Hey Google, teach me something new” or “Hey Google, tell me a story” to be presented with learning or story collections of apps.


Above: The new Google Assistant learning hub.

Image Credit: Google

In an effort to improve sharing and transactions, Google says it’s introducing household authentication tokens that allow users in a home to share games, apps, and more. In the future, users on one smart display will be able to start a puzzle, for example, and let other users on another device pick up where they left off. As for transactions, smart displays will support voice-match as an option for payment authorization ahead of an on-display CVC entry next year.

Lastly, Google is launching two features in beta — Link with Google and App Flip — for improved account linking flows and reintroducing its Action links discovery tool as Assistant links. Link with Google enables anyone with a logged-in Android or iOS app to complete the linking flow on a smart display with a few taps, while App Flip allows users to link their accounts to Google without having to re-enter their credentials. Meanwhile, Assistant links enable developer partners to deliver Assistant experiences on websites as well as deep links to Assistant integrations from anywhere on the web.

Big Data – VentureBeat

Alexa users can now buy gas with a voice command at some stations

September 1, 2020   Big Data


In partnership with ExxonMobil and fintech provider Fiserv, Amazon today launched an Alexa feature that lets users pay for gas with a voice command. The integration was first announced in January at the 2020 Consumer Electronics Show, but today marks its rollout at over 11,500 Exxon and Mobil stations across the U.S.

Starting today, customers with Alexa-enabled vehicles, Alexa-enabled devices like Echo Auto and Echo Buds, or the Alexa app on Android or iOS can say “Alexa, pay for gas” to have Alexa confirm their station location and pump number. Confirming the location and number will activate the pump, after which customers can optionally select the fuel grade and begin fueling.

Amazon says transactions are made through Amazon Pay and powered by Fiserv, which handles things like geolocation at Exxon and Mobil stations, pump activation, payment processing, and payment tokenization. Gas payments will default to the primary payment method associated with customers’ accounts (excepting Amazon gift cards), but Exxon notes on its website that there’s currently no way to earn Exxon Mobil Rewards+ points.

Amazon says access to the new feature — which doesn’t work with Echo devices “built for at-home use,” including the Echo, Echo Show, and Echo Dot — can be secured with a voice PIN if customers choose. This can be set up through the Account Settings screen in the Alexa app. The company also notes that the name, email address, payment, location, and other Amazon account information required to support the transaction will be shared with ExxonMobil at payment time.

The launch of Alexa-powered payments at ExxonMobil stations comes after Amazon debuted partnerships allowing users in some countries — including the U.S. and India — to pay utility companies with information stored in their Amazon accounts. Coinciding with the launch, a complementary bill management feature rolled out to Alexa-enabled devices in the U.S.

Big Data – VentureBeat

Voice Search Is the Future of SEO

March 26, 2020   CRM News and Info

Achieving top search position for dozens of keywords is the ultimate goal for many content marketers, but Google keeps throwing curveballs our way to make that quest even more difficult. One of these recent updates that is shaking up how we optimize our websites for SEO is voice search. 

So what is voice search? Voice search makes it just a bit easier for us to find the resources and answers we need to make purchasing decisions by allowing us to input our searches audibly instead of typing them up. And because of this added layer of convenience, this won’t be going away any time soon. Organizations will have to consider this factor when optimizing their websites and webpages for SEO. 


Voice search is still new, so many of us marketers are still trying to wrap our heads around how it works and its vast potential. The good news is that getting ahead of the game now will put you one step ahead of your competitors. Optimizing your website for this new search medium will give your SEO a huge boost and provide a more enjoyable customer journey and user experience. Most importantly, your sales and revenue could potentially skyrocket.

What Is Voice Search? 

Voice Search enables users to use voice commands instead of typing to search for what they need. Users can search by opening their browser, clicking on the microphone icon to the right of the search bar, and recording whatever phrase they would type otherwise. These search terms can be as simple as “coffee shops” or “coffee shops in northwest Portland open Monday at 7 AM.” 

With this feature being available on both desktop and mobile, the voice search option is quickly increasing in popularity. 

Here are three best practices for optimizing your keyword strategy for voice search.

Consider How People Talk When Developing Your Content

Very few of us talk the same way we write, and we should expect to see that difference reflected in the way our target audience looks for information via voice search.  

For example, while you might type in “Los Angeles restaurants” and then proceed to narrow down your search, you might voice search “What are some good restaurants near the Los Angeles Arts District?” Again, individuals who voice search are using this method because it’s convenient and probably don’t want to spend the additional time scrolling and narrowing down their search. 

You can optimize your content for voice search by including:

  • Questions: Incorporate “Who, What, Where, When, and Why” language into your content. Think of the information your target audience is looking for and how they’ll voice that question when speaking instead of typing. This practice will also drastically improve your chance of earning a featured snippet, which we’ll cover in the next section. 
  • Location: Try to be as narrow and descriptive as possible when it comes to describing your location. Make sure to mention the neighborhood you’re in and nearby attractions in your content (whenever it makes sense to do so; don’t force it). 
  • Related Products or Services: Is there a related product or service that your audience is constantly searching for? If so, you probably want to mention them in your content to ensure they can find you more easily. For example, marketing automation users often want to know if our platform integrates with the CRM they are using, so we make sure to include phrases such as “Does Act-On integrate with Microsoft Dynamics?” or “What are the benefits of integrating SugarCRM with Act-On?” throughout our website.
  • Highlight Your Competitive Advantages: The last thing you want is for your target customer to voice search “What is the best [insert product or service here]?” and have your company not appear in the search results. To prevent this from happening, make sure to mention what makes you stand out from the competition and feature plenty of positive customer reviews throughout your website.

Focus on Earning Featured Snippets 

You’ve probably scrolled through your search results and noticed that, many times, there’s a featured preview with the answer to the question you were looking for. And, as a marketer, you might be wondering how this company managed to get a bit more than just their website name and meta description featured in search results. 

These highlighted posts are called featured snippets, and companies can earn them by offering what Google deems a thorough answer to commonly asked questions. As we previously mentioned, the introduction of voice search means that more individuals will be searching for more specific questions instead of generic terms. So you can kill two birds with one stone by structuring your content to answer frequently asked questions — you’ll improve your ranking in search results and increase your chances of earning that featured snippet spot.

Securing a featured snippet can help you do more than get your content front and center in search results. Better search positioning means better web traffic and more customers. And since securing this spot establishes your company as a thought leader, a featured snippet can also help you build trust and credibility with your audience early on in the sales process — providing added leverage over the competition. 

If you’d like to learn more about how to earn a featured snippet on Google, check out this blog post we wrote on the topic not too long ago. 

Sticking to SEO Basics Can Go a Long Way

You have to learn how to walk before you can run, and that same concept applies if you want to optimize your website for voice search. In other words, the two tactics we mentioned will go further if you have a solid SEO foundation to start with — and that means having good keywords, meta descriptions, and little-to-no errors.  And by errors, I don’t mean just looking out for typos; you should ensure that your page is indexed, the meta description is the correct length, you are using the correct headings (H1, H2) with the proper keywords, and that your page doesn’t take too long to load. 
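
To make these basics concrete, here is a minimal, illustrative Python sketch that spot-checks a few of them: title and meta description length, H1/H2 headings, and a noindex flag. The URL is a placeholder, the character targets in the comments are common rules of thumb rather than Act-On guidance, and this is not a substitute for a full audit tool.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL; swap in the page you want to check.
url = "https://example.com/"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
meta = soup.find("meta", attrs={"name": "description"})
description = meta["content"].strip() if meta and meta.has_attr("content") else ""

print("title length:", len(title))                   # roughly 50-60 characters is a common target
print("meta description length:", len(description))  # roughly 150-160 characters is a common target
print("H1 headings:", [h.get_text(strip=True) for h in soup.find_all("h1")])
print("H2 headings:", [h.get_text(strip=True) for h in soup.find_all("h2")])

robots = soup.find("meta", attrs={"name": "robots"})
if robots and "noindex" in robots.get("content", "").lower():
    print("warning: this page is marked noindex and will not appear in search results")
```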

If you’re completely new to SEO, it can be a very difficult concept to grasp — and you might not know where to start to make sure you’re on the right track. Thankfully, there’s a wide array of tools available to ensure your webpages are optimized. Act-On’s SEO audit tool, for example, checks your web and landing pages to verify that you’re following best practices with effective keywords and zero errors.

Guide Customers Through a Targeted and Effective Customer Journey With Act-On

As any good marketer knows, SEO is only one part of the lead generation and growth marketing process. There’s so much more to creating an awesome holistic strategy!

If you’re ready to learn about the digital tools you can use to enhance your marketing strategy, please schedule a demo with one of our marketing automation experts. 

If you’re not quite there yet, please download the eBook below to learn how to optimize your funnel!

Act-On Blog

How AI scrubs out fake reviews and amplifies your customer’s voice (VB Live)

December 14, 2019   Big Data

AI offers an efficient, cost-effective way to unlock actionable insights from your reviews. Watch this VB Live event to learn how AI and machine learning can help you to deal with the negative reviews, resolve customer pain points, and build trust. 

Access free on demand here.


Consumer trust in information that’s available from businesses is dropping drastically. Almost 85% of millennials don’t trust traditional advertising, and 70% rely on reviews and recommendations from other customers, which of course includes online feedback.

A lot of businesses don’t like to look at their negative reviews, and don’t know how to deal with them or respond to them. But 82% of the top-performing companies report paying close attention to the human experience around digital and tech. That means keeping up-to-the-minute tabs on your reviews.

“One of the greatest challenges is trying to take advantage of this feedback and really use it to turn around the conversation,” says Ramin Vatanparast, chief product officer at Trustpilot. “For example, negative reviews are a great opportunity for businesses to reach out to unhappy customers and understand what the problem was with their experience and try to win back their respect, and build trust and credibility.”

Positive reviews also offer vital information to businesses — they’re not just pats on the back, but taken together are an incredibly accurate barometer for the success of your customer service and your products.

Whether it’s reviews, comments in a forum, support requests, or any of the feedback that you’re getting from customers, the challenge from a technical perspective is that it’s quite hard to analyze all of that text at scale — the things that people are writing and saying about your company — and understand the core themes, says Chris Hausler, senior data science manager at Zendesk.

“If you’re a small company receiving 10 to 20 reviews or support requests a week, it’s easy for someone to individually sit down and read those and understand your customer’s perspective,” Hausler explains. “But when you scale that to tens of thousands, or even millions, it’s no longer feasible for an individual, or even a group of individuals, to read and understand all that text.”

That’s where AI comes in. Natural language processing helps identify patterns in that text, understand the core themes in what people are talking about, and give you a perspective that you aren’t able to get as an individual with that scale of feedback.
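
As a toy illustration of the kind of automation Hausler describes, the Python sketch below uses TF-IDF features and non-negative matrix factorization, one common topic-modeling approach (not necessarily the one Zendesk uses), to surface recurring themes in a handful of made-up reviews.

```python
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

# Made-up feedback; in practice this would be thousands or millions of reviews.
reviews = [
    "Shipping was slow and the box arrived damaged",
    "Great support, the agent resolved my issue in minutes",
    "Delivery took three weeks and the packaging was crushed",
    "Customer service answered quickly and was very helpful",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(reviews)

# Factor the TF-IDF matrix into a small number of recurring "themes".
model = NMF(n_components=2, random_state=0)
model.fit(tfidf)

terms = vectorizer.get_feature_names_out()
for i, component in enumerate(model.components_):
    top_terms = [terms[j] for j in component.argsort()[-4:][::-1]]
    print("theme", i, ":", ", ".join(top_terms))
```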

“The biggest opportunity isn’t just the star rating for those reviews,” Vatanparast says. “It’s looking at the context and being able to take advantage of the tools and technologies that let you understand the sentiment behind those star ratings, analyze your data, and improve your business.”

The other big piece is customer trust, he says.

“When we look to the Edelman trust barometer, you’ll notice that the overall trust level for online businesses, and even online digital platforms, is going down,” he says. “At the same time, fake reviews are creating a huge challenge for consumers, who now need to identify what’s real and what’s fake.”

AI also really has a role to play here, particularly at scale, Hausler says.

“On the fake review side, particularly when it’s coming from bots, there tend to be telltale signs in the ways they communicate that make them slightly different from the way that a human would write a review,” he explains. “Much in the way that we have something like spam detection in your Gmail account, you can train your AI to identify where you’re getting these bot reviews and then move them to the side so you and your customers are not being misled.”

Fraud detection models, combined with offering customers the ability to flag suspicious content, lets you secure your customer feedback and make sure the reviews they see in the platform are as legitimate as possible.
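
Continuing the spam-detection analogy, the sketch below trains a tiny text classifier on hand-labeled reviews and scores new ones. It is purely illustrative: real fraud models use far more data and signals (account age, posting patterns, and so on), and this is not a description of Trustpilot’s or Zendesk’s actual systems.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hand-labeled sample: 1 = suspected bot/fake review, 0 = genuine review.
train_texts = [
    "Best product ever!!! Buy now!!! Five stars!!!",
    "Amazing amazing amazing, perfect, highly recommend, best best best",
    "The zipper broke after two weeks, but support sent a replacement quickly",
    "Fits as expected; the color is slightly darker than in the photos",
]
train_labels = [1, 1, 0, 0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # word and bigram features
    LogisticRegression(),
)
model.fit(train_texts, train_labels)

# Score incoming reviews; anything with a high probability gets routed to a human moderator.
for text in [
    "Buy now, best product, five stars, amazing!!!",
    "Delivery was late but the jacket is warm and well made",
]:
    probability_fake = model.predict_proba([text])[0][1]
    print(round(probability_fake, 2), text)
```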

“The biggest thing for any organization nowadays is to build strong principles and values around trust and what they stand for, to be able to openly talk about it and share their data and make sure the information they provide their consumers, including reviews, is as clear as possible and as trustworthy as possible,” he says.

For a deep dive into how an AI-powered review platform works to sort, flag, and capture sentiment from your reviews, how to turn negative reviews into opportunities to connect with your customers, and how to implement a fully secure review platform to capture customer sentiment, catch up on this VB Live event now!


Access for free on demand.


You’ll learn:

  • How to find themes in your customer reviews so that you can fix the real pain points instead of putting “band-aids” on each unhappy customer
  • Steps to guide your business decisions around what your customers say they want, not what you think they want
  • The importance of addressing your most common negative issues first to see an immediate change in your customer satisfaction level
  • The concept of always improving your customer experience — searching for trends in your good reviews to turn them into great reviews

Speakers:

  • Ramin Vatanparast, Chief Product Officer, Trustpilot
  • Chris Hausler, Senior Data Science Manager, Zendesk

Big Data – VentureBeat

Picovoice’s web console lets device makers create their own voice assistants

December 4, 2019   Big Data

Following the rollout of its cloudless, edge device-focused voice assistant stack, which comprises wake word detection, speech-to-text transcription, and speech-to-intent capabilities, Picovoice announced a web console that lets you easily create and train your own voice models. Alongside the web console release, the company joined the Arm AI Ecosystem Partner Program, which gives Picovoice deeper access to Arm IP and to chip manufacturers like NXP. Specifically, Picovoice is focused on Arm Cortex-M chip designs, which are extremely low power and can integrate into all manner of IoT devices — but are powerful enough to support its voice assistant without the need for a cloud connection.

The big idea is that OEMs can use the Picovoice web console to whip up voice controls for their devices large and small, for minimal cost. Products with voice assistants on board are hot, and although the likes of smart speakers and smart displays get the bulk of the attention, some level of voice control is possible on all manner of lower-power edge devices, from coffee makers to lights. Amazon in particular has aggressively pushed into this internet of things (IoT) space with household devices like its microwave and lamp, but those are all part of Amazon’s Alexa ecosystem.

Picovoice sees an opportunity to help other companies capture a chunk of that market.

“Over the course of the past few years, we realized that companies are really struggling to build robust voice experiences, because they have to use several tool sets from different companies and glue them together,” said Picovoice business development chief Mehrdad Majzoobi.

He added that training voice models is resource-intensive — even to simply create a wake word — and requires expertise that not all device makers have available, which drove Picovoice to build a tool that removes that need. “Even non-technical stakeholders in companies [like] product managers and UX designers [can] use the tool to build the experience,” he said. Then, after essentially a one-button export, a company’s engineers can handle integrating the voice capabilities into devices.

The resulting voice models are so efficient, he said, that they can run on multiple classes of tiny Cortex-M microcontrollers. Picovoice’s next goal is for its tool to be able to support 1 billion devices.

In a demo, Picovoice showed VentureBeat how easy the web console is to use. 

You start with the wake word section: Simply type in the word or phrase you want to use, select the platform (e.g., Arm Cortex-M, x86_64, etc.), and click Create Wake Word Draft. Click Submit to train the model, and in a couple of hours (or less) it will be available for download.


Next, you create the “speech-to-intent context,” which is a domain you want to use it for, like smart lighting — Picovoice has templates for common domains, or you can define your own. Then click Create Context, and you’ll see a list of parameters to adjust, like “turnLight” and “turnOffLight,” state, and location.

It doesn’t take a programmer to figure out how to set all the intents, states, and parameters thereof, but there is a bit of a learning curve. You have to type in some text commands and use characters like “$” to define them. But you can test your expressions instantly right there in the browser and make sure you’re on the right track, editing or deleting any that don’t work and adding more if need be.

When you’re done, you click Train, and in a matter of hours you’ll have your model.

With the console, you can see clearly how Picovoice’s domain-specific approach makes sense. For smart lighting, you don’t need a universe of possible commands, like your phone assistant would. You just need a certain set of lights in a given location to turn on or off when you say a given word or phrase. The Picovoice console appears to make that easy for non-technical people.
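
For illustration, here is a hedged Python sketch of how models exported from the console might be used with the Picovoice SDK. The access key, the constructor parameters, and the model file names are assumptions based on the current SDK documentation and may differ by version; the audio is read from a 16 kHz, 16-bit mono WAV file rather than a microphone to keep the example self-contained.

```python
import struct
import wave

from picovoice import Picovoice  # assumption: pip package "picovoice"


def on_wake():
    print("Wake word detected; listening for a command...")


def on_inference(inference):
    if inference.is_understood:
        print("intent:", inference.intent, "slots:", inference.slots)
    else:
        print("Command not understood by the smart-lighting context.")


pv = Picovoice(
    access_key="YOUR_PICOVOICE_ACCESS_KEY",  # assumption: newer SDK versions require an AccessKey
    keyword_path="hey_lamp.ppn",             # hypothetical wake word model exported from the console
    wake_word_callback=on_wake,
    context_path="smart_lighting.rhn",       # hypothetical speech-to-intent context from the console
    inference_callback=on_inference,
)

# Feed the engine fixed-size frames of 16-bit samples, as the SDK expects.
with wave.open("command.wav", "rb") as audio:  # must match pv.sample_rate (16 kHz), mono
    while True:
        data = audio.readframes(pv.frame_length)
        if len(data) < pv.frame_length * 2:
            break
        frame = struct.unpack_from("%dh" % pv.frame_length, data)
        pv.process(frame)
```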

Big Data – VentureBeat

You can now use Siri to place Google Voice calls

October 17, 2019   Big Data

If you’re a Google Voice user who frequently kicks off Siri commands from an iPhone, iPod Touch, or iPad, there’s reason to rejoice. As of today, the Google Voice app for iOS integrates with Apple’s intelligent assistant, letting folks who opt in place calls and send messages via Voice by saying things like “Hey Siri, call John on Google Voice” and “Hey Siri, send a message using Google Voice.”

By way of refresher, Google Voice provides call forwarding and voicemail services, voice and text messaging, and call termination for Google Account customers. It’s available in the U.S., Canada, Denmark, France, Netherlands, Portugal, Spain, Sweden, Switzerland, and the U.K. at no charge, excepting a $0.01-and-up per-minute fee for calls placed internationally.

If you happen to be in possession of more than one Google Voice account, you’ll have to first set the default account for calls initiated outside of the Google Voice iOS app — i.e., through Siri or the Contacts app. Forget to do this and calls won’t go through.

Follow these steps:

  1. Sign in to the Google Account you want to set as the default account.
  2. On the iPhone or iPad, open the Voice app.
  3. At the top left, tap Menu > Settings.
  4. Under Calls, turn on Default account for calls placed outside the app.

Next, set up Siri and Google Voice:

  1. On your iPhone or iPad, tap Settings > Siri & Search.
  2. Turn on Listen for “Hey Siri” or Press Home for Siri > Enable Siri.
  3. From the Settings screen, in the list of apps, tap the Voice app.
  4. Turn on Use with Siri.

Google Voice’s newfound support for Siri was presumably made possible by Apple’s CallKit, which enables developers to integrate calling services with other call-related apps on iOS. CallKit displays the same interfaces as the Phone app for calls and responds appropriately to system-level behaviors such as Do Not Disturb. Apps like VirtualPBX tap it currently, as do CallHippo, Skype, WhatsApp, and TrueCaller.

It’s worth noting that Voice isn’t the first Google app with native Siri support. In September, Google Maps added integration for navigation with iOS 13 and CarPlay. And last November, the Google Assistant app brought support for Siri shortcuts in an upgrade that enabled users to set up phrases they frequently use with Assistant. (For instance, a request like “Turn on the lights, Google” to Siri could open Assistant and run a routine that switches on connected lightbulbs.)

We’ve reached out to Google and Apple for additional info and will update this story when we hear back.

Big Data – VentureBeat

Capgemini: Consumers increasingly prefer voice and chat assistants to humans

September 5, 2019   Big Data

The human touch apparently doesn’t mean as much as it once did. A new report says consumers are rapidly embracing automated assistants as the underlying technology becomes more powerful.

The Capgemini Research Institute today released a report titled “Smart Talk: How organizations and consumers are embracing voice and chat assistants.” While the technology still faces challenges, the report paints a generally rosy picture and concludes that users increasingly prefer voice and chat assistants to humans.

“This research establishes that conversational assistants are the future of customer interactions, valued by consumers for their convenience and by companies for the operational efficiencies they enable,” said Mark Taylor, head of customer engagement at Capgemini Invent, in a statement accompanying the report.

From a consumer electronics perspective, voice assistants on smartphones and smart speakers have been among the biggest success stories. The underlying voice recognition technology has made rapid advancements in terms of accuracy, and the intelligence that powers these services has also made great leaps.

As such, a broad range of businesses have been scrambling to figure out the most effective use cases.

The introduction of such services has not been without some controversy, of course. Just over half of all consumers continue to express concerns about privacy and data security. And the same percentage expressed a desire for greater personalization of the services, the report says.

But on the whole, these issues don’t seem to be impeding either adoption or satisfaction.

In fact, the study found that adoption is accelerating, with 40% of voice assistant consumers having started within the last 12 months. Going forward, 70% of consumers are expected to use voice or chat assistants rather than visiting a store or bank.

The bottom line is that consumer satisfaction for smartphone-based voice services has increased from 61% in 2017 to 72% this year.

The study is the result of a survey of 12,000 consumers and 1,000 business executives.

Businesses seem likely to keep investing in this area, with 76% of companies saying they have experienced measurable benefits as a result of implementing the technology, such as significant reductions in spending on customer service.

The report recommends that organizations keep experimenting to find the right mix of interactions between human agents and automated services. It also urges companies to develop new visual features, such as informative videos, and to remain focused on building trust with users.

Big Data – VentureBeat

How to Check the Pulse of Your Voice of the Customer Program

August 7, 2019   CRM News and Info

These days, what business isn’t pushing to become more customer-centric? And for good reason: 88 percent of businesses view customer experience as a competitive differentiator. A customer-centric culture is a source of long-term strategic competitive advantage. Culture is the one aspect of a business that cannot be replicated easily by the competition.

To harness this competitive advantage and separate yourself from the crowd, you’ll need to put customers at the heart of your business. Sounds simple enough, right? Not so fast. You need a structured process for listening to customers, and building a Voice of the Customer (VoC) program is like training for a marathon.

A well-designed and executed VoC program doesn’t just appear fully formed — much like you don’t start running a week before the race and expect to set a personal record. You might even think you’ve reached the finish line, when in reality you’ve still got miles to go to reach full CX maturity.

CX professionals pound the pavement at multiple levels: tactical and strategic; leadership and front-line; financial and operational. At any given moment, all these elements are moving at different speeds with unique trajectories to the winner’s circle.

As you consider how to evolve your program, drive business value and keep your key stakeholders on board for the long haul, it’s essential to take a hard look at how your program is performing.

When taking stock of your program’s success, it’s useful to dive into different aspects of maturity. While you may be at similar stages across the board, it’s likely that you’ve progressed more quickly in some areas than in others. Following are the five facets of customer experience maturity that can be used to check the pulse of your program:

1. Clarifying Vision

Trying to prove the value of CX to the entire organization? You’ll have to build a strategy that clearly links your VoC program to key business priorities and your brand promise. So, what’s the trick to creating this connection? Balance, ambition and focus — don’t try to do too much at once!

Don’t forget to think carefully about the phasing of your program: take one step at a time, make sure you’ve got each step right, and tweak your approach when necessary.

Signs of maturity: Haven’t rolled out every stage of your program? Don’t worry. A mature CX vision is about knowing what you’re setting out to achieve and having a well thought out roadmap to get there.

If you have a 12-month plan, consider developing a flexible plan for the longer-term — but stay focused on the overall business priorities, and be prepared to change your tactics as your organization evolves.

2. Tailoring Designs

Your VoC program must appeal to two critical audiences: the business and the customer. To meet the needs of your business, the program needs to be relevant, robust and prioritized. You achieve this by talking to your internal stakeholders to understand what they need from you. It also means being able to align with other data sources and information that are important to them.

For customers, surveys have to be easy and engaging. If customers aren’t compelled to respond and don’t understand what you are looking for, then your poorly designed survey will generate low response rates, and even worse, bad data.

Remember that there are many moments that matter — having quick, easy feedback mechanisms that cater to the preferences of customers (e.g., Web-based, app-based, SMS, etc.) will lead to a more robust and complete view of the customer experience.

Signs of maturity: The design of your VoC program might be a smash hit at the start but can often become an afterthought once the ball gets rolling. A mature CX team stays on its toes and continues to make design changes that suit both the business and customers as the market and expectations evolve.

3. Engaging Across the Business

To drive the right levels of focus and cross-functional thinking, business leaders and the front-line team have to be engaged. You simply can’t overvalue buy-in from your leadership team. Without it, stakeholders won’t take the program seriously.

A high-standing CX champion will motivate the rest of the organization to meet targets, and most critically, make sure your VoC program is assigned enough budget. At the end of the day, executive engagement is all about the money — you’ll need a clear view of the ROI of your program and how CX actions are delivering true business change.

At the same time, it’s the engagement of the front-line team that ultimately will determine the success or failure of your CX efforts. Your front-line team actively drives and manages CX and is the face of your business.

Want to get front-line employees involved? Deliver demonstrable value and communicate! Keep them in the loop: Relay what you are hearing from customers and celebrate successes, so employees understand how their role impacts the customer experience.

How do you share CX insights across the entire business? Break down data silos and make sure the right teams get the right information at the right time, so they can reflect upon insights.

Signs of maturity: When you’re just starting out, activities like customer journey mapping are a great tool for engaging your business. With a map, you can define ownership and provide a clear path forward.

More mature programs have a structure in place to review and amend journey maps. Advanced VoC teams also will have a clear communication strategy that includes elements like role-based reporting, internal events, and social media programs to share results and successes.

4. Driving Action

VoC is all about action at a macro (strategic) and micro (tactical) level. Action at an individual tactical/customer level is more likely to bear early fruit, so shout out those early results! That’s the easy part. The hurdles are when your program matures, and those quick wins have passed by.

This doesn’t mean you can’t accomplish great success in the later stages of a VoC program, but big wins often are more long-term, strategic changes that are harder to come by and take time to implement.

Augmenting the micro-level action with more strategic-level initiatives will lead to more substantive, sustainable change, but it takes time and teamwork to accomplish this.

Signs of maturity: Don’t get tripped up by quantity over quality. A mature program is all about making a true impact. Sure, VoC may have driven a dozen business changes in the first year, but a more mature program will deliver real process re-engineering opportunities. When judging your program, ask yourself: Are the changes you’re making tactical or strategic improvements?

5. Delivering Value

In the planning of your program, you’ll surely set expectations on how you plan to deliver value. There are plenty of ways CX insights can benefit the overall business. Is VoC data helping to reduce your churn rate? Is following up with unhappy customers and resolving their issues increasing revenue?

As your program evolves and matures, you’ll be expected to deliver on those projections and seek out new ways to deliver value across the company.

Signs of maturity: A truly mature program will be embedded in a company’s culture, and the value it delivers will be unquestionable — but you still should be able to quantify it!

The value of CX should be clear on multiple levels — think financial and operational — and these can be reached in the earlier stages of your program. The most mature VoC programs will provide cultural value too.

With companies racing to differentiate on CX, it is essential to measure how your VoC program is performing to separate from the pack. Your strategic vision, program design, level of engagement, resulting actions and the value you’re providing will shift and evolve as your program gains mileage.

As you aim to speed up success, understanding how you’re performing in each area can help you improve planning the next steps for turning your program into a true game-changer that can go the distance for the entire business.


Mark Ratekin is director of CX consulting at Confirmit.

CRM Buyer

Baidu’s DuerOS voice platform is now on 400 million devices

July 3, 2019   Big Data

A self-driving milestone and an AI accelerator partnership with Intel weren’t the only high points of Baidu’s Create conference in Beijing this week. The company took the wraps off of DuerOS 5.0, its latest-gen natural language platform, and it revealed that DuerOS’ install base recently surpassed 400 million as voice queries topped 3.6 billion. The former metric is up substantially from 150 million and 100 million last November and August, respectively, when DuerOS reached the 800-million-queries mark.

For the uninitiated, DuerOS is a suite of software developer kits (SDKs), APIs, and turnkey solutions that enable original equipment manufacturers (OEMs) to integrate the platform with smart speakers, refrigerators, washing machines, set-top boxes, and more.

Perhaps the headliner of DuerOS 5.0 is its full-duplex mode, which allows it to respond without the need for a wake word. (It’s a lot like Google Assistant’s Continued Conversation, which rolled out to supported third-party smart displays like JBL’s Link View and LG’s Xboom AI ThinQ WK9 in March.) Other spotlight features include the ability to recognize “in sync” the moment to act on tasks and when to listen and refrain from reacting, and enhanced voice-controlled playback of Chinese online video platform iQIYI’s content on select devices.

Baidu showcased three new models in its popular Xiaodu smart speaker lineup alongside DuerOS 5.0, all of which start at about $28 (RMB 199): King Kong, Play, and Xiaodu at Home 1C 4G. King Kong launched in June and boasts an infrared receiver and a built-in DLNA projection screen. Baidu says that Play was specially designed with “innovative features that cater to younger generations.” Last but not least, the portable Xiaodu at Home 1C 4G supports nano-SIM cards and doubles as a mobile hotspot (not unlike Huawei’s Alexa-enabled AI Cube).

Baidu says that developer adoption of its Xiaodu speakers, which ranked third globally and first in China in the first quarter of 2019 according to Canalys, is on the upswing. There are now 2,000 skills available from 32,000 developers (up from 16,000 developers as of November 2018), compared with about 600 skills in May 2018. And Baidu anticipates a further boost with the launch of its Xiaodu VIP program this week, which it describes as “a way to harness the power of user experience through AI by allowing users and brands to interact effortlessly.”

DuerOS hasn’t quite reached the storied heights of Amazon’s Alexa and Alexa Voice Service, which have more than 90,000 third-party apps and thousands of brands signed on — not to mention compatibility with 60,000 smart home devices. But Baidu says it’s working with heavy hitters including Huawei, Vivo, Oppo, and others to integrate DuerOS into future flagships, in addition to automakers such as BMW, Daimler, Ford, Hyundai, Kia, Chery, BAIC, FAW, and Byton and hotel chains like InterContinental.

Big Data – VentureBeat

Amazon sends Alexa developers on quest for ‘holy grail of voice science’

June 15, 2019   Big Data

At Amazon’s re:Mars conference last week, the company rolled out Alexa Conversations in preview. Conversations is a module within the Alexa Skills Kit that stitches together Alexa voice apps into experiences that help you accomplish complex tasks.

Alexa Conversations may be Amazon’s most intriguing and substantial pitch to voice developers in years. Conversations will make creating skills possible with fewer lines of code. It will also do away with the need to understand the many different ways a person can ask to complete an action, as a recurrent neural network will automatically generate dialogue flow.
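
For context, here is a rough sketch of a conventional custom-skill handler written with the Alexa Skills Kit SDK for Python, where the developer scripts each turn and re-prompt by hand and maintains a separate interaction model full of sample utterances. The intent and slot names (“OrderRideIntent,” “destination”) are hypothetical; Conversations is pitched as generating much of this dialogue management automatically.

```python
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name


class OrderRideIntentHandler(AbstractRequestHandler):
    """Handles a single hypothetical 'order a ride' turn; each re-prompt is hand-written."""

    def can_handle(self, handler_input):
        return is_intent_name("OrderRideIntent")(handler_input)

    def handle(self, handler_input):
        slots = handler_input.request_envelope.request.intent.slots or {}
        destination = slots["destination"].value if "destination" in slots else None
        if not destination:
            # Without Conversations, the developer explicitly asks for each missing slot.
            return (handler_input.response_builder
                    .speak("Where would you like to go?")
                    .ask("Where to?")
                    .response)
        return (handler_input.response_builder
                .speak("Okay, requesting a ride to {}.".format(destination))
                .response)


sb = SkillBuilder()
sb.add_request_handler(OrderRideIntentHandler())
handler = sb.lambda_handler()  # entry point when deployed as an AWS Lambda function
```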

For users, Alexa Conversations will make it easier to complete tasks that require the incorporation of multiple skills and will cut down on the number of interactions needed to do things like reserve a movie ticket or order food.

Amazon VP David Limp refers to Conversations as a great next step forward. “It has been sort of the holy grail of voice science, which is how can you make a conversation string together when you didn’t actually programmatically think about it end-to-end. […] I think a year or two ago I would have said we didn’t see a way out of that tunnel, but now I think the science is showing us that [although] it will take us years to get more and more conversational, […] this breakthrough is very big for us, tip of the iceberg,” Limp said.

It begins with a night out and casual conversation

The Alexa Conversations journey is first emerging with a night-out scenario. In an onstage demo last week at re:Mars, a woman buys a movie ticket, makes dinner reservations, and hails a ride in about one minute. (Atom Tickets, Uber, and OpenTable are early Alexa Conversations partners.)

The night-out scenario is the first of what Amazon says will become a collection of bundled experiences to get things done.

Conversations may someday power more difficult tasks such as a weekend trip scenario that Limp demonstrated last fall at an event to introduce nearly a dozen new Alexa-powered devices. The holy grail Limp refers to is a transformation that every major tech company with an AI assistant is chasing: evolving assistants from a voice interface that completes basic tasks one at a time into an assistant that can handle complex and complicated tasks.

Two years ago, during a rare onstage gathering of current or former leaders from the Alexa, Google Assistant, Siri, and Cortana teams, Viv cofounder and Siri co-creator Adam Cheyer — a person who’s pondered the future of voice assistants since the 1990s — wondered aloud about an assistant that can guide you through the scenario of planning for your sister’s wedding. (Samsung acquired Viv in October 2016 to enhance its Bixby AI assistant.)

At the event, Cheyer talked about how voice will define the next decade of computing and the importance of bridging first-party AI assistant services with a third-party voice app ecosystem. “I don’t want to have to remember what a car assistant can do, what the TV system can do, Alexa versus Cortana versus … too much. I want one assistant on every device to access every service without any differentiation between what’s core and what’s third-party,” Cheyer said.

Amazon is working towards that end, starting by reducing the number of interactions you need to get things done with Alexa. Last fall, Amazon introduced Follow-Up Mode, so you can engage in multiple interactions but only have to say the “Alexa” wake word once. With Conversations, the number of interactions necessary to execute the night-out scenario is cut down from 40 to about a dozen back-and-forth interactions.

To further increase the perception that Alexa is capable of natural conversation, the AI assistant learned to whisper when a person is whispering, and can now respond to name-free skill invocation. That means you can say “Get me a ride” instead of first having to launch the skill by saying, “Alexa, launch the Uber skill.”

Creating the perception of intelligence

Amazon isn’t alone in its ambition to make an assistant capable of fluid conversation like the kind you’d expect from another person. Google introduced Continued Conversations so you don’t have to say the wake word to continue to talk about something. Alexa Conversations also gives Amazon’s AI assistant the power to quickly take care of things or engage in commerce akin to Google Assistant’s new food ordering powers and Google’s Duplex. Duplex for the Web and deep connections between Android apps and Google Assistant made their debut last month. Microsoft is also bringing similar intelligence to workplace assistants with Semantic Machines, a startup it acquired in 2018.

It all points to the issue that more complex tasks require more than a single exchange, which Alexa AI senior product manager Sanju Pancholi emphasized. “When you’re starting to solve more complex problems, there is more give and take of information, there are more decisions at each point in time, and hence there are multiple actions that can come in context of the same conversation with different individuals,” he said.

He led a session at re:Mars to make a pitch for Alexa Conversations for businesses and developers, and talked about an assistant that can “solve their product and service needs in the moment of recognition when they realize they need it.”

To be seen as intelligent, Amazon thinks an assistant should understand natural language, remember context, and make proactive predictive suggestions, traits that can prove an assistant is smart enough to accomplish more complex tasks. Doing away with a need to repeat yourself is also critical.

“If you make [customers] repeat information again and again and again, you are forcing them to believe that they are talking to a dumb entity, and if that’s the rapport you’re building with them from the get-go, the chances are they’re never going to delegate higher order tasks to you, because they will never think you’re capable of solving higher-order problems for them,” he said.

The Alexa Skills Store now has more than 90,000 skills, and 325,000 developers have used the Alexa Skills Kit, Pancholi said. Alexa is now available on 100 million devices.

Pancholi shared with developers that potential next steps for Alexa Conversations scenarios may include collections of skills to help people watch content at home, get food delivered, or buy a gift.

Skills on skills

In an interview with VentureBeat, Alexa chief scientist Rohit Prasad declined to share details about use cases that may be taken up next, but believes this could include ways to help plan a weekend. Prasad, who has led Alexa AI initiatives for language understanding and emotional intelligence, said Conversations is designed to stitch together the voice ecosystem for engagement increases for skills and Alexa alike.

“The developer proposition is that you start getting more traffic and more discovery the more cross-skilled we become, like the fact that the night-out experience is now getting you to order a cab. So Uber and Lyft will see more traffic as well and more customer engagement. So that, and plus skill discovery will happen naturally as part of that. So that’s a huge piece of our value proposition in this case.”

Even Blueprints — voice app templates for private, custom Echo skills — may soon incorporate Conversations, Prasad said. Batches of custom skills for the home could, for example, walk kids through multi-step routines, do chores, and help count down to important dates.

The first proactive Alexa features — Hunches, which suggests event reminders and smart home actions, and Alexa Guard, which detects the sound of breaking glass or a smoke alarm — were rolled out last fall.

Conversations could someday also become part of Amazon’s voice assistant for the workplace offering if the module is incorporated into Alexa for Business, which added support for Blueprints in March.

Brands and indie developers

In January 2018, CNBC reported that Amazon was in talks with brands like Procter & Gamble and Clorox to ink deals to promote their products to Alexa users.

Amazon Alexa VP Steve Rabuchin insists there’s no way for businesses or developers to get prioritized by Alexa’s voice app recommendation system, but the Alexa voice app ecosystem may face another problem. Because voice apps often work without a screen, bundling skills together means some skills may inevitably be left out or go unranked.

This is especially important for voice apps. Unlike an app store search on a smartphone, Alexa’s voice app recommendation engine only serves up three skills at a time.

“Our vision isn’t to end up where it’s just the biggest brands or most popular,” Rabuchin said in an interview with VentureBeat. “A lot of our most popular skills are from indie developers, individual developers.”

Amazon’s skills recommendation engine, which responds when you say things like “Alexa, get me a ride,” recommends voice apps based on measurements like engagement levels, which Amazon started paying developers for in 2017.

Conversations will incorporate skill quality measurements like user ratings and engagement levels. Factors like regional significance, whether a skill works on a smart display, and personal information may also decide which skills appear during Alexa Conversations interactions.

“I think we have a good playbook to start from like, I don’t think it’s a perfect playbook, but it’s a great one to start with,” Prasad said.

Big Data – VentureBeat