
Tag Archives: Plan

Developing a backup plan

November 30, 2020   BI News and Info

The most important task for a DBA is being able to recover a database in the event it becomes corrupted. Databases can become corrupted for many different reasons. The most common cause is a programming error, but databases can also be corrupted by hardware failures. Regardless of how a database becomes corrupt, a DBA needs a solid backup strategy in order to restore the database with minimal data loss. In this article, I will discuss how to identify the backup requirements for a database, and then how to use those requirements to develop a backup strategy.

Why develop a backup plan?

You might be wondering why you need to develop a backup plan. Can’t a DBA just implement a daily backup of each database and call it good? Well, that might work, but it doesn’t consider how an application uses a database. If you have a database that is only updated with a nightly batch process, then taking a daily backup right after the nightly update process might be all you need. But what if you had a database that was updated all day long from an online internet application? If you take only one backup per day of a database that gets updated all day online, then you might lose up to a day’s worth of online transactions if it were to fail right before the next daily backup. Losing a day’s worth of transactions would most likely be unacceptable. Therefore, to ensure minimal data loss occurs when restoring a database, the backup and recovery requirements should be identified before building a backup solution for a database.

Identifying backup and recovery requirements

Each database may have different backup and recovery requirements. When discussing backup and recovery requirements for a database, there are two different types of requirements to consider. The first requirement is how much data can be lost in the event of a database becoming corrupted. Knowing how much data can be lost will determine the types of database backups you need to take, and how often you take those backups. This requirement is commonly called the recovery point objective (RPO).

The second backup requirement to consider is how long it will take to recover a corrupted database. This requirement is commonly called the recovery time objective (RTO). The RTO identifies how long the database can be down while the DBA is recovering it. When defining the RTO, make sure to consider more than just how long it takes to restore the database. Other tasks take time and need to be included: identifying which backup files need to be used, finding those files, building the restore script/process, and communicating with customers.

A DBA should not define the RTO and RPO in a vacuum. The DBA should consult each application owner to set the RTO and RPO requirements for each database the application uses. The customers are the ones who should drive the RTO and RPO requirements, with help from the DBA of course. Once the DBA and the customer have determined the appropriate RTO and RPO, the DBA can develop the backups needed to meet these requirements.

Types of backups to consider

There are a number of different backup types you could consider. See my previous article for all the different backup types. Of those different backup types, there are three types of backups that support most backup and recovery strategies. Those types are Full, Differential, and Transaction log.

The Full backup, as it sounds, is a backup that copies the entire database off to a backup device. The backup will contain all the used data pages for the database. A full backup can be used to restore the entire database to the point in time that the full backup completed. I say completed because, if update commands are being run against the database at the time the backup is running, then they are included in the backup. Therefore, when you restore from a full backup, you are restoring a database to the point-in-time that the database backup completes.
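To make the syntax concrete, here is a minimal sketch of a full backup in T-SQL, assuming SQL Server (the platform Simple Talk covers); the database name and backup path are hypothetical placeholders, not taken from the article.

```sql
-- Minimal full database backup (hypothetical database name and path).
-- CHECKSUM verifies page checksums while reading; STATS = 10 reports progress.
BACKUP DATABASE [SalesDB]
TO DISK = N'D:\Backups\SalesDB_Full.bak'
WITH CHECKSUM, STATS = 10;
```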

A differential backup is a backup that copies all the changes since the last full backup off to a backup device. Differential backups are useful for large databases, where only a small number of updates have been performed since the full backup. Differential backups will run faster and take up less space on the backup device. A differential backup can’t be used to restore a database by itself. The differential backup is used in conjunction with a full backup to restore a database to the point in time that the differential backup completed. This means the full backup is restored first, then followed by restoring the differential backup.
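A differential backup uses nearly the same syntax; the sketch below reuses the same hypothetical database and backup folder as the full backup example above.

```sql
-- Differential backup: copies only the data changed since the last full backup.
BACKUP DATABASE [SalesDB]
TO DISK = N'D:\Backups\SalesDB_Diff.bak'
WITH DIFFERENTIAL, CHECKSUM, STATS = 10;
```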

The last type of backup is a transaction log backup. A transaction log backup copies all the transactions in the transaction log file to a backup device. It also removes completed transactions from the transaction log to keep it from growing out of control. A transaction log backup, like a differential backup, can’t be used by itself to restore a database. It is used in conjunction with a full backup, and possibly a differential backup, to restore a database to a specific point in time. The advantage of transaction log backups is that you can tell the restore process to stop at any point in time covered by a transaction log backup. By using this stop feature, you can restore a database right up to the moment before it became corrupted. Typically, transaction log backups are taken frequently, so there might be many of them between each full or differential backup. Transaction log backups are beneficial when there is a requirement for minimal data loss in the event of a database becoming corrupted.
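For completeness, here is a sketch of a transaction log backup for the same hypothetical database. Note that taking log backups assumes the database is using the FULL (or BULK_LOGGED) recovery model.

```sql
-- Transaction log backup: copies log records to the backup device and lets
-- SQL Server truncate the inactive portion of the log so it doesn't keep growing.
BACKUP LOG [SalesDB]
TO DISK = N'D:\Backups\SalesDB_Log_202011301000.trn'
WITH CHECKSUM;
```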

Developing a backup strategy for a database

When determining a backup plan for a database, you need to determine how much data can be lost and how long it takes to recover the database. This is where the RTO and RPO come in to determine which types of database backups should be taken. In the sections below, I will outline different database usage situations and then discuss how one or more of the backup types could be used to restore the database to meet the application owner’s RTO and RPO requirements.

Scenario #1: Batch updates only

When I say “Batch Updates Only”, I am referring to a database that is only updated using a batch process, meaning it is not updated online or by an ad hoc update process. One example of this type of database is one that receives updates in a flat-file format from a third-party source on a schedule. When a database receives updates via a flat file, those updates are applied using a well-defined update process. The update process is typically on a schedule that coincides with when the flat file is received from the third-party source. In this kind of update situation, the customer would have an RPO defined something like this: “In the event of a corrupted database, the database needs to be restored to the point just after the last batch update process.” The RTO would be set to something like this: “The restore process needs to be completed within X number of hours.”

When a database is only updated with a batch process that runs on a schedule, all you need is a full backup taken right after the database has been updated. By doing this, you can recover to a point in time right after the database was updated. Using only the full backup will meet the RPO. Since the time needed to restore a full backup is about the same as the time it takes to back up, the RTO needs to be at least as long as it takes to run the restore process, plus a little more time for organizing and communicating the restore operation.
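Since the RTO in this scenario is driven mostly by restore time, it helps to look at what the recovery itself would be. The statement below is a minimal sketch using the same hypothetical database and file names as the earlier examples.

```sql
-- Recover the database from the full backup taken right after the batch load.
-- REPLACE overwrites the existing (corrupted) database; RECOVERY brings it online.
RESTORE DATABASE [SalesDB]
FROM DISK = N'D:\Backups\SalesDB_Full.bak'
WITH REPLACE, RECOVERY, STATS = 10;
```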

Scenario #2 – Batch updates only, with short backup window

This scenario is similar to the last one, but in this situation, there is very little time to take a backup after the batch processing completes. The time it takes to back up a database is directly proportional to the amount of data that needs to be backed up. If the backup window is short, it might be too short to take a full backup every time the database is updated. This might be the case when the database is very, very large. If there isn’t time to do a full database backup and the amount of data updated is small, then a differential backup would be a good choice to meet the RPO/RTO requirements. With a differential backup, only the updates since the last full backup are copied to the backup device. Because only the updates are backed up and not the entire database, a differential backup can run much faster than a full backup. Keep in mind, to restore a differential backup, you must first restore the full backup. In this situation, a full backup needs to be taken periodically, with differential backups being taken in between the full backups. A common schedule would be to take the full backup when there is a large batch window, like on a Sunday when there is no batch processing, and then take differential backups during the days when the batch window is short.
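Under this kind of schedule, a recovery touches two backup files: the last full backup and the most recent differential. A minimal sketch of that restore sequence, again using the hypothetical file names from above, looks like this:

```sql
-- Step 1: restore the weekly full backup, leaving the database in the RESTORING state.
RESTORE DATABASE [SalesDB]
FROM DISK = N'D:\Backups\SalesDB_Full.bak'
WITH NORECOVERY, REPLACE;

-- Step 2: restore the most recent differential backup and bring the database online.
RESTORE DATABASE [SalesDB]
FROM DISK = N'D:\Backups\SalesDB_Diff.bak'
WITH RECOVERY;
```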

Scenario #3 – Ad hoc batch updates only

Some databases are not updated on a schedule but instead are updated periodically by an ad hoc batch update process that is manually kicked off. There are a couple of different ways of handling backups for databases in this category. The first is simply to run full database backups routinely on a schedule. The second is to trigger a backup as the last step of the ad hoc batch update process.

A routine scheduled full backup is not ideal because the backups may or may not be run soon after the ad hoc batch update process. When there is a period of time between the ad hoc process and the scheduled full backup, the database is vulnerable to data loss should the database become corrupted for some reason before the full backup is taken. In order to minimize the time between the ad hoc update and the database backup, it would be better to add a backup command to the end of the ad hoc update process. This way, there is a backup soon after the ad hoc process, which minimizes the timeframe for when data could be lost. Additionally, by adding a backup command to the ad hoc update process, you potentially take fewer backups, which reduces the processing time and backup device space, over a routine backup process.
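One way to wire this up is simply to append a BACKUP statement to the end of the ad hoc update script, as in the sketch below; the stored procedure name is purely illustrative and not from the article.

```sql
-- Hypothetical ad hoc batch update, immediately followed by a full backup.
EXEC dbo.usp_ApplyAdhocUpdates;   -- illustrative update step

BACKUP DATABASE [SalesDB]
TO DISK = N'D:\Backups\SalesDB_AfterAdhoc.bak'
WITH CHECKSUM, INIT;              -- INIT overwrites any earlier backup set in this file
```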

Scenario #4 – Online updates during business hours

In this scenario, the database gets updates from online transactions, but these transactions are only run during business hours, say 8 AM to 5 PM. Outside of regular business hours, the database is not updated. In this situation, you might consider a combination of two different types of backups: full and transaction log backups. The full backup would be run off-hours, meaning after 5 PM and before 8 AM. The transaction log backups would be used during business hours to back up the online transactions shortly after those transactions have been made. In this situation, you need to review the RPO to determine how often to run transaction log backups. The shorter the RPO, the more often you need to take transaction log backups. For example, suppose a customer says they can lose no more than an hour’s worth of transactions; then you need to run a transaction log backup every hour between 8 AM and 5 PM.

Scenario #5 – Online updates 24×7

Some databases are accessed every day all day. This is very similar to Scenario #4, but in this case, the database is accessed and updated online 24×7. To handle backup and recovery in this situation, you would take a combination of full and differential backups along with transaction log backups.

With a database that is updated 24×7, you want to run the full and differential backups at times when the databases have the least number of online updates happening. By doing this, the performance impact caused by the backups will be minimized. There are way too many different database situations to tell you exactly how often a full or differential backup should be taken. I would recommend you try to take a full backup or differential backup daily if that is possible. By doing it daily, you will have fewer backup files involved in your recovery process.

The transaction log backups are used to minimize data loss. As in scenario #4, the frequency of transaction log backups is determined by the RPO requirements. Assuming that the customer can lose one hour’s worth of transactions, transaction log backups would need to be run hourly, all day, every day to cover the 24×7 online processing.
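Pulling the pieces together, a point-in-time recovery in this scenario restores the last full backup, then the last differential, then the chain of transaction log backups, stopping just before the corruption occurred. The sketch below reuses the hypothetical file names from earlier and an arbitrary stop time.

```sql
-- Step 1: last full backup.
RESTORE DATABASE [SalesDB]
FROM DISK = N'D:\Backups\SalesDB_Full.bak'
WITH NORECOVERY, REPLACE;

-- Step 2: last differential backup taken after that full backup.
RESTORE DATABASE [SalesDB]
FROM DISK = N'D:\Backups\SalesDB_Diff.bak'
WITH NORECOVERY;

-- Step 3: transaction log backups in sequence, stopping just before the corruption.
RESTORE LOG [SalesDB]
FROM DISK = N'D:\Backups\SalesDB_Log_202011300900.trn'
WITH NORECOVERY;

RESTORE LOG [SalesDB]
FROM DISK = N'D:\Backups\SalesDB_Log_202011301000.trn'
WITH STOPAT = '2020-11-30T09:47:00', RECOVERY;  -- illustrative stop time
```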

Developing a backup plan

It is important for a DBA to work with the application owners to identify the backup and recovery requirements for their databases. The application owners determine how much data they can lose (RPO), and how long the database can be down while it is being recovered (RTO). Once the RTO and RPO requirements are defined, the DBA can then develop a backup plan that aligns with these requirements.


SQL – Simple Talk


A 6-Point Plan to Leapfrog to CX Leadership

April 9, 2020   CRM News and Info

Customer experience as a boardroom topic is more relevant than ever. Enterprises are investing significant digital transformation budgets and commissioning large projects to elevate CX. Yet more than 70 percent of digital transformation projects fail to move the needle at scale.

One topic that has a substantial impact on CX is the transformation of customer service operations using digital tools. Although it is often the engine room for driving transformation and change, it is met with limited success.

As enterprises across all industries embark on their journey or reflect on learnings from it, consider following these six steps to success.

1. Eliminate Unnecessary Contact

Radically eliminate unnecessary contact by leveraging a combination of analytics of interactions, and steering contacts to self-service channels (including cognitive digital assistants).

The first wave is to understand the root causes of contacts and unexplained spikes, while leveraging insights gained from customer journey and interaction analytics. For example, a large European-heritage telecom provider identified that 80 percent of the contacts could be bucketed into 20 percent of situations, providing for a clear angle of attack to abate contact volumes.

The next wave is to have a clear strategy to triage and steer the “deflection-ready” contacts to cognitive digital assistants (voice bots, chat bots) — and the rest to agent-assisted (e.g., live chat) or agent-only channels.

Many enterprises seem to struggle with high email volumes, causing a backlog that ultimately gets lost, and low usage of digital assets (e.g., chatbots, mobile apps). For the same telco, implementation of a natural language processing interactive voice response based on a well-mapped customer journey resulted in 30 percent reduction in contact volumes. This allowed the company to reinvest CX budget in reaching out to more customers regarding issues that it wasn’t able to address earlier.

The final wave would be to steer and hand off the complex, multiple intent queries to skilled agents in live chat or voice agents without breaking the customer service journey. For example, best in class CX operations can send a message immediately when they detect breaks in a consumer’s purchase journey. They then can fulfill the buying or support journey by transitioning seamlessly from online to phone and mobile.

2. Simplify Processes to Connect Front and Back Offices

Simplification of business processes and standardization of fewer processes are essential to reduce complexity and achieve economies of scale. Merger and acquisition integration further adds to this complexity.

Re-imagining service designs based on customer journey mapping, and reducing process steps and effort are critical parts of the simplification approach. Process reengineering and omnichannel customer engagement platforms help simplify and harmonize interactions and processes across channels.

For example, a leading travel company was facing dis-economies of scale (more agents required to support growth) due to multiple processes resulting from disjointed journeys and process redundancy. It benefited from implementing an omnichannel CX platform in an incubation lab setup that simulated a real-time customer service environment. That approach allowed it to make refinements before rolling out the platform to its agent network. The company lowered its total spend by 15 percent and improved the CX for its customers.

3. Automate Ruthlessly

While significant progress has been made toward back-office automation, front-office automation presents a significant opportunity to deliver at scale. Interplay of voice and non-voice channels, and interactions with systems of record have been challenging to automate.

Lessons learned are that a majority of efforts focus on task automation using robotic process automation, or RPA (toggling between screens, copy pasting between systems, etc.), but most of the process remains manual due to tribal knowledge and technical limitations.

Advances in technology driven by conversational artificial intelligence and machine learning have ensured that natural language understanding helps recognize multiple intents that enable sales fulfillment with conversational AI bots.

Likewise, computer vision and advanced optical character recognition tools help deliver end-to-end process automation (e.g., returns management, cancellations, document management) for back-office processes that weren’t possible with vanilla RPA tools or that resulted in scrappy implementations that were costly to manage.

4. Enable Agents to Become Super-Agents

When customers cannot solve their problem via self-service (e.g., legacy IVR or FAQ-only chatbots), they typically shift to the most expensive channel (phoning an agent). By then they’re likely frustrated, making it critical that agents possess a positive attitude and resiliency at all times.

Customer complaints and questions typically are more complex, requiring the agent to employ thoughtful problem-solving skills and use good judgment to provide the best customer experience possible.

Customers expect agents to be their expert advisors and help them navigate through complex problems or even negotiate an offer, such as lower interest on a credit card. Ultimately, agents need to be prepared to address a variety of inquiries, many of which are difficult to predict. Instead of relying on scripts and training, a super-agent whose skills are augmented with digital tooling can come to the rescue.

Agent enablement and empowerment tools to create a “super-agent” through enhanced performance (accuracy, compliance, cross selling, first contact resolution) include the following:

  • A truly omnichannel 360-degree view of a customer in a unified desktop with next-best actions reduces average handling time and improves business outcomes (e.g., cross selling). For example, implementation of a unified desktop for a European online real estate platform resulted in increased revenue by 20 percent. Similarly, a telco and media company reduced revenue leakage by 20 percent by increasing the transparency of customers and pricing across siloed channels.
  • Knowledge management tools based on semantic search help improve agent productivity (capturing tribal knowledge, improved search functionality) and accuracy, with fewer downstream tickets resulting in faster resolution and a smaller middle office. Further, real-time Voice of the Customer programs (enabled by speech analytics) help the agent understand sentiment in real time. This is accomplished with recommendations on next-best actions integrated within the agent cockpit to cross-sell and win back vulnerable or lost customers.
  • AI-powered translation services allow an agent proficient in one or two languages to service customers (via synchronous or asynchronous messaging) in any other language. Using this solution, enterprises can leverage the same customer service team to serve multiple languages where needs are small-scale, but language skills are scarce.

5. Personalize Interactions and Deliver Real-Time Next Best Actions

Delivering contextual campaigns that allow proactive engagement with consumers based on their choice of day and time for sales, preventive break/fix, or renewal/retention is changing the brand-to-customer paradigm.

A global luxury retailer understood this challenge and implemented a data-driven, analytical process to engage and acquire customers proactively in social media. It automated the entire sales to fulfillment journey to cater to the demands of its target audience (mobile-first, millennials, busy during working hours, etc.).

6. Transform Models and Metrics

Most importantly, transform commercial models and CX success metrics from input-based measures to business outcomes.

These include reductions in total cost of ownership, as well as topline improvement and increased customer retention.


Shyan Mukerjee is chief digital and transformation officer at Majorel. He has deep expertise in customer experience services, business process outsourcing, digital services and software. Mukerjee previously served as partner at EY Strategy & Operations, and as vice president, global services and sourcing advisory, at Everest Group LLC. Throughout his career, Mukerjee has worked with clients across Fortune 500 companies and private equity with a focus on the technology and business services sector.


CRM Buyer


Not too early to plan for the Trump presidential library at Briny Breezes, Florida

December 26, 2019   Humor


A trailer park could be the site of the IMPOTUS presidential library.

It could, OTOH, be another real estate grift, separate from the plan to put the Trump Tomb at Bedminster, NJ, near his golf course.

Not unlike his proposal to move Seoul, Korea away from the border with North Korea, moving the existing residents at Briny Breezes Florida should be easier. And the Trump Library site could be literally and figuratively under water eventually. Trump’s change of residence now makes him a potential favorite son.


A Palm Beach County trailer park could be the site of a future Donald J. Trump presidential library – and the pitch may be made by Vanilla Ice, a close friend of Donald Trump Jr. https://t.co/9KN4PmZ4AJ

— Jonathan Lemire (@JonLemire) December 21, 2019

BRINY BREEZES — A Palm Beach County trailer park could be the site of a future Donald J. Trump presidential library.

That’s the vision of James Arena, a real estate broker and resident of Briny Breezes, the 43-acre coastal town just south of Boynton Beach that’s made up entirely of a mobile home park.

Arena, an avid Trump supporter, says he thinks he can convince the president to buy the land and turn it into a personal monument. Arena said he has the ball rolling by reaching out to his friend, rapper and Palm Beach County resident Vanilla Ice, who is close to the Trump family.

“Vanilla Ice ran it by Donald Jr.,” Arena said of the president’s eldest son. “He called me back and said, ‘Man, I think they’re really into it.’”

Briny Breezes is a town in Palm Beach County, Florida, United States. The population was 411 at the 2000 census. Briny Breezes (or “Briny” as it is known locally) is a small coastal community of approximately 488 mobile homes along State Road A1A. Briny is a private community consisting mostly of “Snowbirds” from the Northeast, Midwest, and Canada. It and Ocean Breeze are the only two mobile home parks in Florida that are incorporated towns. 
As of the census of 2000, there were 411 people, 266 households, and 129 families residing in the town. The population density was 5,912.5 inhabitants per square mile (2,267.0/km²). There were 534 housing units at an average density of 7,682.0 per square mile (2,945.4/km²). The racial makeup of the town was 99.27% White, 0.49% Asian, and 0.24% from other races. Hispanic or Latino of any race were 0.49% of the population.
en.wikipedia.org/…

Host Chuck Todd asked Trump if he’d thought about a presidential library.

After saying, “I’m so busy,” twice Trump said, “I know a lot of people mentioned it to me, the presidential library.” Todd next asked if Trump knew where he’d want the library.

Then, the man who markets everything from his name to bogus for-profit colleges to steaks went into money-making mode and answered, “I have a lot of locations actually. The nice part, I don’t have to worry about buying a location.”

[…]

Even with Mar-a-Lago being promoted using taxpayer dollars and accusations of profiteering from his inauguration, just to name a few unsavory ways Trump and his family have attempted to make some side cash since his election, Chuck Todd was somehow surprised by the president’s response, saying, “I have to say, I didn’t see the idea of his library on one of his properties coming with that answer.”

www.rollingstone.com/…!


moranbetterDemocrats


Microsoft brings back coopetition: Salesforce partnership part of bigger plan

November 19, 2019   CRM News and Info

Back in April 2016, I attended the very first Microsoft Envision Conference. While I had very mixed feelings about the conference itself, I was able to ascertain through a significant number of subtextual hints that Microsoft under Satya Nadella’s leadership was taking a crucial turn. Here’s how I described it then:


“If I had to encapsulate what I hear Satya Nadella say and others support at Envision it is that Microsoft is not going to be a software company per se any longer but a company that is devoted to being a critical part of the infrastructure of business via Azure and businesses via thoroughly contemporary versions of Office and various Dynamics applications – and that Dynamics would become a business solutions platform and Office a unified communications platform – all with the express purpose of providing those highly personalized outcomes that lead to major gains in productivity. Microsoft becomes (if they achieve their vision) the company that is fully interwoven, enmeshed in the 21st century business infrastructure. That would be the “in-the-nutshell” (what a quaint expression) version.”

Literally all of this has not only been confirmed but as of yesterday is coming…I’d say with a vengeance but that’s not in the spirit of this thing…with Microsoft on a significant roll at a gathering-no-moss speed. 

On November 13, 2019, Microsoft and Salesforce announced that they had signed a deal with two technologically important provisions:

  • The Salesforce Marketing Cloud would be running on Microsoft Azure. This is significant given that the Sales and Marketing Clouds are running on AWS, and in 2017 Salesforce said it would do stuff with Google Cloud, though to my knowledge that is a bit meaningless at the moment (maybe not in the future, but as I write this in 2019). But Lord knows, when the cold wind blows it’ll turn your head around (thank you James Taylor for that, from Fire and Rain at 2:11 on this video) and they will actually do something with Google Cloud. It is hard to ignore how highly business – that means the B2B world especially – thinks of Azure, the #1 most-trusted IaaS platform. In every survey I’ve seen from 2016 to the present, and there are dozens (in no particular order: here, here, here), when it comes to business Microsoft Azure wins – either as most utilized, on track to overtake AWS, or most trusted, especially at the enterprise level. Now, before you complain, I’m well aware that I can find surveys that say the opposite, but the interesting thing here is that there has been enough volume and velocity of reports saying that Azure is trumping AWS in the enterprise to make that a consideration in your cloud provider strategy and assessment. I’m no fool, which is why I look at volume and velocity, not a single report. The other thing that makes this alliance particularly interesting at the level of Azure and the Marketing Cloud is that a survey from Densify and another from Kentik in 2019 both confirmed that multi-cloud deployments are increasing to the point that it’s now a real “thing.” In fact, Densify says “multicloud deployments are becoming the new norm,” something that I’m sure both Microsoft and Salesforce are acutely aware of – and each for their own reasons.
  • Salesforce also announced Sales and Service Cloud integrations with Microsoft Teams which is eating Slack for lunch these days (though I’d rather have tuna salad on soggy rye bread than Slack for lunch. Not a fan really). A lot of that is Salesforce’s recognition of the increasing adoption of Microsoft Teams and the desire to make it interoperable with Chatter. A very smart move on both parts.

However, this is not what makes this big deal a big deal. It’s Microsoft’s grand strategy, which translates into their mission and vision, that makes this a big deal for the industry despite the limited scope of the actual arrangement.

Microsoft has the foresight to recognize that in order to become a mission-critical part of all 21st century business infrastructure, they can’t really just try to compete for the crown. That would be silly. We live in a world that, at the core, is governed by diversity – not just of color or gender but of choice. And that dovetails with the one mission all humans have in common – in fact, the only mission they have in common – the desire to be happy in their individual lives. Having control over those choices on the path to that happy life is paramount. Not only are their customers looking to choose what technologies they use to make their lives at work more effective and convenient, but they want control over those choices. Microsoft, rather than taking an us-or-nothing approach, is saying: no matter what technology you use, we can underlay, overlay, embed in, or integrate with it, so we are good with what you’ve decided, mi amore. And we can provide, via Azure – and that’s the notable part of this – the infrastructural services you need to do so.

To that end, Microsoft has been on a tear for the last six-plus months cementing alliances not only with Salesforce but with:

Microsoft-Oracle: Announced on June 5, 2019, this was a cloud interoperability partnership so that services and workloads could run across both Azure and Oracle Cloud. The key paragraph from the press release was this one:

“Connecting Azure and Oracle Cloud through network and identity interoperability makes lift-and-improve migrations seamless. This partnership delivers direct, fast and highly reliable network connectivity between two clouds, while continuing to provide first-class customer service and support that enterprises have come to expect from the two companies. In addition to providing interoperability for customers running Oracle software on Oracle Cloud and Microsoft software on Azure, it enables new and innovative scenarios like running Oracle E-Business Suite or Oracle JD Edwards on Azure against an Oracle Autonomous Database running on Exadata infrastructure in the Oracle Cloud.”

Defined by interoperability. At Oracle OpenWorld, Larry Ellison in his keynote described what he literally called Oracle’s “wonderful” partner Microsoft. Aside from the fact that I had never heard Larry Ellison use the word “wonderful” before, in the past there was no shortage of digs at Microsoft, so this represents a significant and, to me, welcome evolution in the relationship between the two companies – and with a lot of runway for the relationship to be carried considerably further.

Microsoft-ServiceNow: The original partnership between these two companies was announced in October 2018 but was limited to the federal public sector – more of a tactical alliance. It was ServiceNow IT workflows combined with Microsoft Azure to enhance “digital transformation” at federal agencies. (Digital transformation is, of course, the catchall buzzword in every press release of every vendor for everything.) Most importantly, buzzwords aside, it was an important first step – and a tactical test – toward a more strategic relationship, announced in July 2019, that extends the idea of the workflows within an Azure infrastructure to “enterprise customers in highly regulated industries.” While this is still a bit restricted, with Bill McDermott coming in as the new ServiceNow chieftain after leaving SAP, this is likely to be taken a significant amount further, since under McDermott’s tutelage, SAP and Microsoft always had a friendly, if at times competitive, relationship going back many years. So his predisposition is to ally with Microsoft, as could be seen with the Open Data Initiative (ODI) (see below), which consists of SAP, Microsoft, and Adobe and was a major cooperative venture during Bill’s tenure. This bodes well for the broadening of this alliance even more – especially since Microsoft is thinking “ecosystems” now. That means they see that ServiceNow plays a role in their end-to-end customer ecosystem which they themselves cannot play – and thus they partner with ServiceNow accordingly.

Microsoft-SAP: This is actually one of the longest-standing partnerships. The partnerships, which have had an on-again, off-again vibe, harken back to 2005’s Mendocino/Duet technology announcement and release (2006) and now extend to in-cloud migration, allowing customers and others to migrate to SAP S/4HANA via Azure services. This partnership, announced October 20, 2019, once again shows that the common theme for all of Microsoft’s strategic partnerships is Azure and the services that it provides.

Microsoft-Adobe – I call this longer-standing, superbly crafted partnership the GARP – the Get A Room Partnership – because it is so close and so intimate that they almost shouldn’t be doing those things in public. I’ve written extensively on this one because it is a paradigm of how strategic partnerships that are ecosystem-focused should work. To get more details, check out what I wrote about its strengths and limitations in 2018, this Watchlist winner post this year that provides what I thought about it from Adobe’s perspective (included in it but not an article about the partnership per se), and some more of the recent evolution of the partnership, with Magento and Marketo playing a role in that, according to my dear bud and CRM Playaz partner Brent Leary and included in his take on the Salesforce-Microsoft announcement of a few days ago. To summarize, this is arguably not only the best-executed and closest strategic relationship I’ve ever seen from two vendors, but it should be seen as a paradigm for how to do one of these. It involves, of course, running a number of Adobe’s services (most recently managed services for Adobe Experience Manager) on Azure and tying a lot of Adobe’s application architecture to Azure. It also involves Microsoft using Adobe Digital Marketing as Microsoft’s primary enterprise B2C marketing offering and, in conjunction with that most recently, tying Marketo Engage to both Dynamics 365 and LinkedIn even as Microsoft continues to evolve its own marketing applications. Even at the business level, Microsoft and Adobe salespeople are compensated for the sales of specific applications of the other partner in order to encourage “bundled” apps and services from both combined. This is the best of the best of strategic partnerships and is the trigger partnership for all of the others.

Open Data Initiative (ODI) – In September 2018, Microsoft, SAP, and Adobe announced what they called the Open Data Initiative (ODI). At first it was confusing: it was announced during Dreamforce and seemed to be a dig at Salesforce, and it was unclear whether this was to be a new universal data standard or whether it was to be focused on data interoperability among the data models of the three different companies. It may have been a dig at Salesforce, though that was denied, but the timing remains suspicious nonetheless. Thankfully for all of us, it was not meant to be a new universal data standard but instead the architecture for interoperability for the data of the three companies. Whew. But once again, regardless of what the purpose was, this is a strategic pact Microsoft is driving with its new partners and one that fits their game plan exceptionally well.

While these are all germane to the tech industry and Azure and are interesting to people like me (and, I presume, you) we can’t forget all the strategic ecosystem partnerships that Microsoft forged in 2019, so briefly, here they are:

Sony – Announced in May 2019, this partnership was to explore opportunities in gaming and AI, which, given the competition between Xbox One and PS5, made this a significant effort – again eschewing competition for coopetition.

AT&T – Announced in July 2019, this is a multi-year effort that follows the partnership thread around an Azure framework and services effort that involves the public cloud, 5G and many other facets. The key part of the announcement from the press release:

“As part of the agreement, AT&T will provide much of its workforce with robust cloud-based productivity and collaboration tools available with Microsoft 365, and plans to migrate non-network infrastructure applications to the Microsoft Azure cloud platform.”

Humana – In October 2019, Microsoft announced a strategic partnership with Humana to reimagine health care solutions for the 21st century aging population. Embedded in the agreement was the following: 

“Using the power of Microsoft’s Azure cloud, Azure AI, and Microsoft 365 collaboration technologies, (bold mine) as well as interoperability standards like FHIR, Humana will develop predictive solutions and intelligent automation to improve its members’ care by providing care teams with real-time access to information through a secure and trusted cloud platform.”

Once again, Azure is the centerpiece of this vertically specific partnering. Even though a very different kind of company, it follows a bit along the lines of the agreement with ServiceNow. Vertically specific, but on the Azure IaaS platform.

Allianz – Just a few days ago, Microsoft announced that it will be working with Allianz and its insurance tech spin-off Syncier to develop customized insurance industry solutions based on – what else? – Azure, and with an Azure marketplace for insurance solution providers to show their wares (as long as they are built with Azure in mind, of course).

What does this all mean?

Microsoft is executing well on its grand strategy (not the same as a strategy) – to become a mission-critical part of all business infrastructure in the 21st century. Obviously, to accomplish this they need the platform, architecture, services, and framework to do that – and they have that: Azure. They also have the confidence of the enterprise market in Azure that they need to justify their case that this is something that will benefit the markets they are addressing. If you look at the current makeup, AWS and Azure are the two most dominant IaaS platforms, and while AWS remains the leader, Azure is gaining ground. But that’s not all that goes into accomplishing their objective. Having the IaaS platform is a prerequisite to be sure, but you have to be able to prove that you can support the provision of outcomes that the business community, from IT to line of business, is looking for. The technology has to be put to use for the customer.

In Microsoft’s case, to their great credit, they know that. Have they justified that with these alliances? Not yet.  What they have justified with these partnerships though is:

  1. They are committed to thinking about the world via platforms and ecosystems.
  2. They understand that interoperability in a diverse world is necessary, not an evil to be overcome. We don’t live in a world in which sole-provider exclusivity is really much of an option if that’s all you are providing. Don’t get me wrong. It is an option. But Microsoft is fully cognizant that no one really wants to dump what have been highly useful investments in IT just to be devoted to a single vendor’s offering. The likelihood of that decreases as the era continues. So they see ecosystems interwoven with a technology matrix that ties multiple companies’ technologies together as the way to go.
  3. Thus, the partnerships you see above. All of them are governed by an Azure-managed container and are built to meet specific objectives.
  4. Sometimes – as in the case of Oracle, AT&T, and SAP – it’s to allow workloads to be managed in Azure; in others it’s a vertically specific effort – see ServiceNow, Humana, and Allianz/Syncier – which, while couched in the language and buzzwordry (my new term) of digital transformation, is still workflows, services, etc. in an Azure container.
  5. Microsoft recognizes this and apparently is willing, more than any other company I’ve seen to date, to run with the idea that interoperability, cooperation, ecosystems, non-exclusivity, and diversity of choice are how they are going to best effect their goal. Thus the nine strategic efforts which, with the exception of Adobe, were all generated this year.

Do I think they can pull this off? Possibly. Microsoft has deep pockets, a clear-cut objective, and the willingness to throw out the norms of competition that we’ve seen get vicious at times in the tech world. Have they done enough? Not yet. There are many more pieces to the completion of this puzzle that go beyond just Azure containers, so to speak. Office 365 plays a big role. Other alliances. Their own partner ecosystem. Their business applications (Dynamics 365), platform (PowerBI), and citizen apps builder (PowerApps) all play a role. So there is more to consider, but this current, rather dramatic effort is worth watching, and Microsoft, if they can minimally stay the course and optimally escalate without losing focus, is a company that will have to be reckoned with a great deal in 2020 – in a cooperative way, of course.


ZDNet | crm RSS


Why it is so important to adopt Geo-Analytics in your business plan

July 31, 2019   Microsoft Dynamics CRM


What is Geo-Analytics

Geo-coded, or location-based, data integrated with analytics results in powerful GeoAnalytics. GeoAnalytics provides hidden insights into the geographical aspect of data, extracting crucial information from a geospatial system. It provides precise data, such as the geo-codes of a location (in simple terms, latitude and longitude), as well as broader data such as postal code, street name, and house number, along with other details that are subject to change.

GeoAnalytics largely helps in understanding the aspects of geospatial analysis, including climate change monitoring, sales analysis, human population insight, precision farming, agricultural maps, and many others. There are various modeling techniques followed in geospatial analytics. The first GIS (geographical information system) was founded to create a manageable inventory of natural resources. Roger Tomlinson created its design for automated computing to store bulk data and derive buried information from it for the Canadian government. He also gave GIS its name.

How Does Geo-Analytics Work

Although there has been advancement in GeoAnalytics, most systems are not yet able to manage and analyze huge chunks of data. This is especially true for geospatial data that deals with complex issues such as spatial dependence. Spatial dependence is the tendency of nearby locations to influence each other and exhibit similar attributes. Spatial analysis systems were largely designed to compute finite datasets. Large spatial data meant much more work to calculate spatial dependencies, and thus parallel computing, where multiple processes are carried out independently, couldn’t perform to its utmost efficiency.

In order to deal with large geospatial datasets, GeoAnalytics uses coding and scripting to create new systems. These systems are powerful enough to deal with satellite data and perform super-modern calculations emulating supercomputers where parallel computing could also be included. This is a huge migration from the rudimentary practices of geospatial analytics.

The wave of GeoAnalytics is now drifting further towards predictive analytics, i.e., predicting future outcomes based on the trends and statistics that have been recorded. Organizations with geospatial predictive analytics have a competitive edge over those that don’t. This enables businesses to have better visualization and analysis of data, giving them the ability to plan against a familiar blueprint.

Implementation of Geo-Analytics

GeoAnalytics can also be used to study the behavioral patterns of a population. For instance, say there is an epidemic in an area; doctors can analyze Census data to view the most affected areas and perform GeoAnalytics to determine the factors behind the root cause of the disease. They can launch campaigns hosting health check-ups, analyze eating habits, hold surveys to analyze the root cause of the disease, start preventive measures in the area, and take other actions.

This data can be stored in the GeoAnalytics system and further used for predictive analysis and for taking precautionary measures. By extending the capabilities of GeoAnalytics in this era, it has become available at your fingertips. Whether you want to perform spatial analysis, predictive analysis, or salesperson performance analysis, or track POS (point of sale) data, you can just turn to GeoAnalytics. The advancement has happened at lightning speed over a few decades and is still continuing.
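As a rough illustration of the kind of location question such an analysis answers, here is a minimal T-SQL sketch using SQL Server’s built-in geography type; the table, column names, and coordinates are hypothetical and are not part of Maplytics, Dynamics 365, or this article.

```sql
-- Hypothetical table of account/patient locations with a geography column.
-- Find every record within 5 km of a point of interest (e.g., an outbreak center).
DECLARE @center geography = geography::Point(26.52, -80.05, 4326);  -- lat, long, SRID

SELECT AccountId, AccountName
FROM dbo.Accounts
WHERE GeoLocation.STDistance(@center) <= 5000;  -- STDistance returns meters for SRID 4326
```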

There are many other reporting tools available that allow number-crunching, but being able to visualize data based on location and do a visual analysis on the map is incredibly powerful. GeoAnalytics is key in defining the ‘Where’ element of data: where the data is coming from and what its business significance is. It has been adopted by major organizations and is instrumental in their growth.

Therefore, adopting GeoAnalytics in your business model for Dynamics 365 / PowerApps (CDS) allows you to stay up to date with the trend and make powerful decisions.

Maplytics™ is a CfMD-preferred solution for Dynamics 365 & PowerApps that helps extract location insights from CRM data using Bing Maps services, increasing salesperson productivity through optimized map routing, turn-by-turn navigation with Google Maps and the Waze app, and distribution and planning of appointments along with Check-In enabled. One of the key features is territory management, which allows defining and aligning territories based on location. Maplytics also enables searching for locations within proximities, improving the area of service, and enhancing localized marketing by campaigns. For better analysis, Maplytics offers locational business insights with heat maps, shape file integration, and Census data, and increases the responsiveness of the services team through location planning and mobility.

You can get your free trial at our Website or Microsoft AppSource!

Follow us on twitter: @maplytics

Happy Mapping!


CRM Software Blog | Dynamics 365


Adam Cheyer: Samsung’s plan for winning with Bixby is empowering third-party developers

June 23, 2019   Big Data

Adam Cheyer, a Brandeis University and UCLA alum with degrees in computer science and AI, knows a thing or two about digital assistants. He previously led the Cognitive Assistant that Learns and Organizes (CALO) project at SRI International’s Artificial Intelligence Center, which sought to integrate cutting-edge machine learning techniques into a platform-agnostic cognitive assistant. Cheyer was on the founding team of Siri, the startup behind the eponymous AI assistant technology that Apple acquired in 2010 for $200 million, and he cofounded Viv Labs, which emerged from stealth in 2016 after spending four years developing an assistant platform designed to handle complex queries.

Samsung acquired Viv in October 2016 for roughly $215 million, and soon after tasked Cheyer and colleagues with building their startup’s technology into the company’s Bixby assistant, which rolled out in March 2017 alongside the Samsung Galaxy S8 and S8+. The fruit of their labor — Bixby 2.0 — made its debut in October 2017 at Samsung’s Bixby Developer Conference, and it formally launched on the Galaxy Note9 in August 2018.

Today, Bixby is available in over 200 countries and on over 500 million devices, including Samsung’s Family Hub 2.0 refrigerators, its latest-gen Smart TV lineup, and smartphone and tablet series including the Galaxy S, Galaxy Note, and mid-range Galaxy C, J, and A. (Sometime this year, the first-ever smart speaker with Bixby built in — the Galaxy Home — will join the club.) On the features front, Bixby has learned to recognize thousands of commands and speak German, French, Italian, U.K. English, and Spanish. And thanks to a newly released developer toolkit — Bixby Developer Studio — it supports more third-party apps and services than ever before.

But the Bixby team faces formidable challenges, perhaps chief among them boosting adoption. Market researcher Ovum estimates that 6% of Americans used Bixby as of November 2018, compared with 24% and 20% who used Alexa and Google Assistant, respectively. For insight into Bixby’s development and a glimpse at what the future might hold, VentureBeat spoke with Cheyer ahead of a Bixby Developer Session in Brooklyn today.

Here’s a lightly edited transcript of our discussion.

VentureBeat: I’d love to learn more about Bixby Marketplace, Bixby’s upcoming app store. What can customers expect? Will they have to search for Bixby apps and add them manually, or will they be able to launch apps with trigger words and phrases?

Adam Cheyer: Bixby Marketplace will eventually be available as part of the Galaxy Store, the app store [on Galaxy phones, Samsung Gear wearables, and feature phones]. Samsung is committed to having a single place where you can buy apps, watch faces, and other items. You’ll be able to find Capsules there in addition, and Capsules contributed by developers in Samsung’s Premier Development Program will have key placement. But I think the coolest way to interact with the Marketplace will be through Bixby itself.

There’s a number of approaches you can take already. One is Bixby Home, [the left-most dashboard] on Galaxy phones’ home screens. On [smartphones], you just tap on the Bixby button and swipe to see featured Capsules and other Capsules in all sorts of categories.

You can also discover Capsules automatically through natural language. For instance, if you say something like “Get me a ride to San Francisco,” Bixby will respond ”Well, you don’t have any rideshare providers enabled right now, but here are several providers in the Marketplace.” You’ll then be prompted to try [the different options] and decide whether you like one brand, another brand, or both. If you enable more than one, Bixby will ask which you’d like to use by default.

Also, as you suggested, you can invoke Capsules with a name or phrase. For instance, you can say “Uber, get me a ride to San Francisco.”

VentureBeat: Right. So eventually, will developers be able to charge for voice experiences — either for Capsules themselves or Capsule functionality? I’m envisioning something akin to Amazon’s In-Skill Purchases, which supports one-time purchases and subscriptions.

Cheyer: Absolutely. The first version of the Bixby Marketplace will not feature what I call “premium Capsules,” which means paid apps or subscription apps. But we’re working hard on that, and we’ll have some announcements around that soon. We know that the content providers of the world need to make a living, and we will absolutely support that.

Transactional Capsules can charge money — we have providers like Ticketmaster and 1-800-Flowers who are accepting purchases today, and we’ve worked really hard to lower purchase friction for our commerce partners. If you’ve saved your card on file anywhere within the Samsung ecosystem, Bixby will know about it — you just say “Send some flowers to my mom,” and the 1-800-Flowers Capsule will say “Great — do you want to pay with your usual card?”

Additionally, we support OAuth for partners like Uber, which have cards on file within user accounts. You’re able to attach Bixby and give it account access privileges so that you can make purchases in these partners’ payment flows.

VentureBeat: You added new languages to Bixby recently — they joined English, Korean, and Mandarin Chinese. What are a few of the localization barriers the team’s facing as they bring Bixby to new territories?

Cheyer: We’re working hard to launch at least five new languages a year, and we may up that in the future.

We believe that offering the right tools and building an ecosystem that scales will enable the world’s developers to create fantastic content for end users. This is especially important when it comes to globalization because it means that we don’t have to localize every single service. Instead, we provide a platform that has the same capabilities in each language.

VentureBeat: So on the subject of developer tools, has the Bixby team investigated neural voices like those adopted by Amazon and Google? I’m referring to voices generated by deep neural networks that sound much more human-like than the previous generation of synthetic voices.

Cheyer: I’m not going to announce anything that’s not yet in production, but I will say that Samsung has significant capabilities not only on the text-to-speech side of things but on the speech recognition side as well. There are significant advances being made in AI, and neural network voices are certainly one of them. There’s also a lot of work ongoing in automatic speech recognition (ASR) — we’re transitioning from hidden Markov model approaches to pure end-to-end neural networks — and we’re seeing ASR models move from the cloud to edge devices like phones.

We’re definitely aware of all of this, and you can rest assured that we’re working hard on these areas.

VentureBeat: You briefly mentioned privacy. As you’re probably aware, there’s some concern about how recorded commands from voice assistants are being stored and used.  Bixby already offers a way to delete recordings, but would the team consider introducing new commands or in-app settings that’d make it even easier to delete this data?

Cheyer: Sure — we’re open to all of those things. Privacy is an important and multifaceted issue. For me personally, it’s not just the fact that my voice was used to tune a particular speech recognition model somewhere. I’m much more concerned about what an assistant’s doing on a semantic level — what it knows about me and why it’s showing me certain information.

But different users are going to worry about different things. You have to offer a variety of ways to let users control the data that companies have, and how they use that data.

One thing that’s important to note is that we’ve made control over what Bixby learns a fundamental platform capability. I’ll give you an example: With Bixby, developers can opt to use machine learning to process requests from users. If I ask Bixby about the weather in Boston, it might not be obvious, but which “Boston” I’m referring to is actually a preference. Most people are going to choose Boston, Massachusetts, but people who live in Texas might choose Boston, Texas. It’s kind of annoying to have to repeatedly specify which Boston you want, which is why Bixby is built to learn preferences about things like restaurants, products, and ridesharing globally and locally.

We surface these learnings to users in the Understandings page. They’ll see that Bixby guessed they meant Boston, Massachusetts the last time they asked about the weather. If that guess was wrong, they can update it or make it clear that they don’t want Bixby to know this information about them. They always have total visibility of what is known about them and how it’s being used, at a very granular level.
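To make this concrete, here is a rough, hypothetical sketch in Python (not Bixby's actual implementation) of how a learned preference might drive disambiguation while staying visible and deletable, in the spirit of the Understandings page described above. All names and data are invented.

# Hypothetical sketch of preference-based disambiguation; not Bixby's actual code.
CANDIDATES = {
    "boston": ["Boston, Massachusetts", "Boston, Texas", "Boston, Lincolnshire"],
}

class Understandings:
    """Per-user store of learned preferences the user can inspect, correct, or delete."""

    def __init__(self):
        self._prefs = {}

    def resolve(self, name):
        options = CANDIDATES.get(name.lower(), [name])
        # Use the learned preference if one exists, otherwise fall back to the most common reading.
        return self._prefs.get(name.lower(), options[0])

    def learn(self, name, choice):
        self._prefs[name.lower()] = choice      # remembered for next time

    def forget(self, name):
        self._prefs.pop(name.lower(), None)     # "don't keep this about me"

    def show(self):
        return dict(self._prefs)                # full visibility into what is stored

u = Understandings()
print(u.resolve("Boston"))           # defaults to "Boston, Massachusetts"
u.learn("Boston", "Boston, Texas")   # the user corrects the guess once
print(u.resolve("Boston"))           # later requests use the learned preference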

VentureBeat: Would you say that this degree of customization and personalization is one of Bixby’s strong suits?

Cheyer: Yes. At a high level, most of what competitors provide developers is speech recognition and language understanding, but that’s pretty much it. From there, the developer has to hardwire each use case and hand-code everything that happens next, such as querying a mapping server to figure out which “Boston” a user is referring to.

Bixby is a radically different platform where every question that gets asked of the user goes through a machine learning process. No other platform on the market from any competitor has anything like that. We have an AI tool that, for every use case a developer comes up with, will write the code and take care of assembly: calling out to different APIs, translating information from one API to another, interacting with the user, and learning from interactions. All of that comes for free with the platform. This type of learning, which we call dynamic program generation, has a lot of benefits around privacy and security. It makes for a much richer experience and saves a lot of development time.

Another thing I think is super exciting about the Capsule approach is that developers can use the same Capsule across multiple devices. They don’t need a Capsule for a TV, a different Capsule for a refrigerator, and a third Capsule for a phone. When they build a Capsule, that Capsule is the same Capsule that supports all languages and all devices. We have this new Bixby Views system that’s publicly available so that developers can build multimodal graphical experiences with the richest palette of components available. And there are automatic conversions so that unmodified capsules will run brilliantly on things as small as watches, as big as The Wall TV, and everything in-between.

The primary usage of most other assistants goes to the built-in services that come out of the box with those assistants. Despite tens of thousands of skills or actions that third parties create, very little usage goes to third-party developers for a variety of reasons.

One of the things that we’re committed to with Bixby is giving as much power and focus to third-party developers as possible. That’s really our long-term plan for winning — having a vibrant ecosystem where most of the business and most of the user traffic goes to third parties.

VentureBeat: Samsung recently announced Bixby Routines, which learn habits to preemptively launch apps and settings. Do you have any updates to share on Routines or any other predictive features that might soon come to Bixby?

Cheyer: Samsung acquired not only Viv Labs’ technology but also SmartThings [in August 2014], and one of the things that SmartThings does is manage complex routines [through SmartThings Cloud]. So there’s that already, and I think there will be continued enhancements to the existing integrations between Bixby and SmartThings; that’s something to look for.

A single use case in the Bixby framework, like booking a flight, gets broken down into around 60 different automatically generated steps executed interactively. If you think about Routines, it’s just chaining together different steps, and the base technology is already being used in Capsule development. Requests are still mostly Capsule-specific, and so there’s limited cross-capsule capability. As soon as we open that up on the natural language side, the execution and dynamic program generation capabilities are all already there.

VentureBeat: It’s fair to say that a key appeal of assistants is how seamlessly they command home devices. I’d love to hear about how the Bixby team is thinking about third-party device integration, and perhaps areas where they’re investing time and resources into making these interactions more powerful.

Cheyer: For me, one of the big announcements this year for Bixby will be device footprint. Last year, Bixby 2.0 was out on a single phone, but Samsung is aggressively working to backport the new Bixby to every smartphone with a Bixby button, and even to some without one. As a result, the number of devices that have Bixby will go up significantly in the handset dimension alone. Samsung is also beginning to ship Bixby on Family Hub refrigerators and Smart TVs today, and on Galaxy Home speakers in the future, and there are many, many more devices that I can’t talk about that will be getting Bixby this year.

I think when you look back at the end of the year, you’ll see Bixby nearly everywhere. And as you know, Samsung has a device footprint of about a billion devices, and the company is on record as saying that by 2020, all of the devices they sell will be connected. They’re investing heavily to make Bixby a ubiquitous control interface.

VentureBeat: Speaking of ubiquity, can we expect to see Bixby on third-party devices in the near future? Will developers and manufacturers eventually be able to add Bixby to their devices?

Cheyer:  That was something extremely important to me when Samsung acquired Viv. We always intended to make the assistant as important an ecosystem as the web and mobile — once you have a thriving ecosystem and businesses are supporting it, the assistant becomes more than just a device feature. You want to get it on every single device in the world to drive more requests and allow users to benefit from its scale.

Samsung committed to me and has publicly committed to shipping Bixby on non-Samsung devices. This year, our work will focus on getting it out to millions of Samsung devices and opening a market for third-party developers. And next year or sometime soon, our intention absolutely is to open up Bixby to non-Samsung devices.


Big Data – VentureBeat


Outlook’s new AI features help you plan for meetings

March 29, 2019   Big Data

Microsoft is beefing up Outlook with new AI-driven features, it announced today. In the next few weeks (in North America, in English), it’ll roll out a trio of features in Outlook on the web that expedite meeting prep, intelligently suggest meetings, and recommend meeting venues, all powered by the new framework the Redmond company announced last September.

“You’ve heard Office has been adding intelligent technology across the suite to help you stay ahead of threats and keep you in the flow of work,” product marketing manager Gabriel Valdez Malpartida wrote in a blog post. “We’ve also been working in Outlook to deliver intelligence that will help you stay organized and get things done.”

First on the list is Meeting Insights. On the eve of your next meeting, Outlook, leveraging Microsoft Graph (a developer platform that connects multiple services, devices, and products across Windows, Office 365, and Azure), will surface potentially relevant documents and notes. Post-meeting, it’ll organize files shared in emails, SharePoint, and OneDrive, along with messages exchanged about the meeting and content shared both during and afterward.


Above: Meeting insights.

Image Credit: Microsoft

“The information is uniquely tailored, so people who are in the same meeting will not necessarily see the same recommendations,” Malpartida said.
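Meeting Insights itself isn't a developer-facing API, but Microsoft Graph, the platform it draws on, is. As a rough sketch, assuming an OAuth access token with the appropriate read scopes has already been obtained (and with illustrative query parameters and fields), fetching a user's upcoming events and recently used files might look like this:

# Rough sketch of querying Microsoft Graph for meeting-adjacent content.
# Assumes an access token with the usual read scopes has already been obtained.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def graph_get(path, token, params=None):
    resp = requests.get(GRAPH + path,
                        headers={"Authorization": "Bearer " + token},
                        params=params)
    resp.raise_for_status()
    return resp.json()

def upcoming_meetings(token):
    # The signed-in user's next few calendar events.
    return graph_get("/me/events", token,
                     {"$top": 5, "$select": "subject,start,attendees"})["value"]

def recent_files(token):
    # Files the user has recently worked with, a plausible input for "relevant documents".
    return graph_get("/me/drive/recent", token)["value"]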

Next up is Suggested reply with a meeting, which builds on Outlook’s existing suggested replies feature. (For the uninitiated, that’s the handy tool that suggests three short, canned replies at the bottom of the compose window when you respond in a thread.) Now, Outlook detects when there’s an intent to meet between people, and automatically bubbles up a Schedule a meeting option. Clicking on it pulls up a meeting form prepopulated with info.


Above: Suggested locations.

Image Credit: Microsoft

To enable Suggested replies (and by extension Suggested reply with a meeting), head to the Settings menu within Outlook, then View all Outlook settings > Mail > Compose and reply. Check the box next to Show suggested replies, select save, and you’re golden.

Lastly, there’s Smart time suggestions and Suggest locations. The former suggests days and times when all attendees are free to meet, while the latter serves up meeting spots — like conference rooms or cafes — tailored to your preferences, along with those places’ addresses, hours, and contact information.

“We aim to bring you features that will make a difference, so some of these features will show up only in Outlook on the web while we gather data on them and evaluate whether to bring them to other Outlook endpoints,” Malpartida added. “We hope these features help you save time and get things done faster.”

The updates follow on the heels of Microsoft’s refreshed Outlook for iOS, which features a new UI, app icon, and sensory feedback. Perhaps uncoincidentally, they also come days after Google brought the AMP for Email project, an open-source branch of its Accelerated Mobile Pages (AMP) Project that promises more dynamic and web-like email experiences, into general availability.


Big Data – VentureBeat


Design, Plan, Execute & Support – A Supply Chain Evaluation Guide

October 31, 2018   NetSuite

Posted by Gavin Davidson, Product Marketing Manager for ERP

As hurricane season approaches, it’s a timely reminder that no matter how efficient your supply chain is, we’re all at the mercy of the next natural disaster. Building redundancy into your supply chain is critical, but even then you need to make sure that your supply routes are spread not only across multiple vendors, but also multiple geographies and ideally multiple shipping routes.

When NetSuite went about re-thinking its supply chain software, I went through the exercise of mapping out all of the processes and ended up grouping them into four categories: Design, Plan, Execute & Support. These categories really helped focus the customers I spoke with as we held our process mapping sessions, but they should also be useful when evaluating the effectiveness of your supply chain redundancy.

Design. Everything benefits from having a solid foundation to build on, and your supply chain is no different. True redundancy will inevitably result in multiple variations of your product designs, especially if you’re relying on your contract manufacturers to source local components. In NetSuite, the Advanced BOM feature allows a specific version of a BOM to be applied to any combination of locations that it’s sourced from; uniquely, it also allows a single BOM to be associated with multiple end SKUs. This really simplifies things when you’re dealing with multiple brands of a single item. Making sure that everyone is working from the same version of the BOM at all times reduces errors, ensures optimal cost of quality and should also minimize returns. Having your Engineering Change Order process embedded in your ERP system, unifying design, engineering and execution, should be a critical part of your company’s strategy.
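As a plain illustration of that idea (this is not NetSuite's data model or SuiteScript, just a sketch with invented item numbers), a single BOM version can be shared by several sellable SKUs and tied to the locations it is sourced from:

# Illustrative data model only; NetSuite's actual Advanced BOM records differ.
from dataclasses import dataclass, field

@dataclass
class BOMVersion:
    name: str
    components: dict                              # component item -> quantity per assembly
    locations: set = field(default_factory=set)   # where this version may be sourced/built

# One BOM version shared by two branded SKUs (all item numbers are invented).
widget_v2 = BOMVersion(
    name="WIDGET-BOM-v2",
    components={"FRAME-01": 1, "MOTOR-STD": 2, "SCREW-M3": 8},
    locations={"Shenzhen CM", "Juarez CM"},
)

bom_for_sku = {
    "WIDGET-RETAIL": widget_v2,    # retail brand
    "WIDGET-PRO": widget_v2,       # professional brand, same underlying BOM
}

# Everyone building either SKU at either location works from the same component list,
# which is the error-reduction point made above.
assert bom_for_sku["WIDGET-RETAIL"].components is bom_for_sku["WIDGET-PRO"].components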

Plan. Building on a well-designed product and supply chain, it’s time to plan for the inevitable: disaster! Whether natural or otherwise, Murphy’s Law states it’s going to happen, so you must be ready for it when it does. One of the keys to effective supply chain management is a control tower, essentially a single place where you can go to see everything that’s happening, evaluate your response and execute upon it. NetSuite has launched a supply chain snapshot feature, a central part of any control tower, that shows all of your inventory and related transactions with the ability to filter by subsidiary, location and so on. Users can now easily look at their global or regional supply situation and quickly source alternates when required, but the key here is to practice your response. Using a sandbox account to plan out some simple and worst-case scenarios, and discussing the best ways to resolve them and how they should be communicated internally and externally, will make it easier to execute an alternate plan when it’s required.

Execute. Communication is a key aspect in all areas of business, but especially so in a complex supply chain. Keeping open lines of communication and making sure that the message being delivered is concise and specific is critical. Consider establishing a communication strategy for suppliers, customers and employees. In today's social media age, people have more information at their fingertips than they can possibly consume – so it’s your responsibility to guide them to the information that’s most important and to make sure that everyone knows what you expect of them. It is YOUR supply chain after all. Not all your suppliers will be fully digitized, but they can still take advantage of using NetSuite’s portals to make sure they are always aware of your current plan and how you expect them to execute against it.

Support. Support is an interesting category. At the end of the day we ended up putting everything that supported but didn’t directly impact the supply chain into this category. Beyond after-sales support, customer service and financials, one of the biggest areas of concern was fully understanding the cost of sourcing products from other geographies. NetSuite’s Inbound Shipment Management and Landed Cost functionality ensures that you can consider freight, duty and brokerage costs – and the impact on profitability – as part of your decision making.
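To see why those costs matter, here is a back-of-the-envelope landed-cost calculation with made-up numbers; NetSuite's Landed Cost functionality allocates these amounts automatically, but the arithmetic is the same in spirit.

# Back-of-the-envelope landed-cost calculation with invented figures.
def landed_unit_cost(purchase_price, units, freight, duty_rate, brokerage):
    duty = purchase_price * units * duty_rate
    total = purchase_price * units + freight + duty + brokerage
    return total / units

unit_cost = landed_unit_cost(purchase_price=10.00, units=1000,
                             freight=1200.00, duty_rate=0.05, brokerage=300.00)
selling_price = 18.00
margin = 100 * (selling_price - unit_cost) / selling_price

print("Landed unit cost: $%.2f" % unit_cost)   # $12.00, versus the $10.00 purchase price
print("Gross margin: %.1f%%" % margin)         # about 33%, not the 44% the invoice alone suggests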

With NetSuite’s 18.2 release, your supply chain managers have unprecedented access to all the information they need to make the decisions that ultimately affect your business’ ability to execute on your plan. Whether that involves design and engineering changes, full documentation control, real-time tracking of current and future inventory balances, inbound shipment management or accurate accounting of landed costs, they will find all of the necessary information at their fingertips, allowing them to execute on their plans and fulfill your company’s goals.

But to really make fundamental change to your business, you’d be wise to consider following our four step methodology to supply chain success: Design, Plan, Execute & Support.

Watch Sneak Peek to catch a preview of what’s new across several different areas of the latest release.

Posted on Tue, October 30, 2018 by NetSuite


The NetSuite Blog


Hypothyroidism revolution reviews – Ultimate thyroid boosting meal plan diet

August 14, 2018   Humor

The hypothyroidism revolution book

Isn’t life too short to suffer from Hypothyroidism symptoms, especially when there’s a natural home remedy that will not only treat Hypothyroidism symptoms but also put you on a pathway to a healthier way of living, one which prevents Hypothyroidism from coming back? Isn’t life too short for you to be settling for a low-energy, low-quality existence? Many people have felt the same way, and have turned for help to the program developed by Tom Brimeyer.


They were able to perform a complete overhaul of the way they lived, following Tom’s system and now their lives have changed forever. A big part of Tom’s program is the Hypothyroid Diet. He calls his system the Hypothyroidism Revolution, a complete life-changing treatment system for both Hypothyroidism and Hashimoto’s Thyroiditis.

Who is Tom Brimeyer?

People call him a miracle-working genius, but Tom Brimeyer is actually a Functional Medicine Practitioner and health researcher specializing in Hypothyroidism and nutrition.

He also suffered from Hypothyroidism and was getting no results from traditional medicine (drugs). After many years of research on the thyroid and hormones, he was able to find answers in the work of fellow researchers studying the effects of energy production and aging.

He began to incorporate these findings into his everyday life and, after many months, formulated the plan that works. He called it Hypothyroidism Revolution, which is now an e-book detailing his entire plan for ridding the body of Hypothyroidism.

 The Hypothyroidism Diet is Part of the Plan But Not All of It

One aspect of the Hypothyroidism Revolution is the Hypothyroidism Diet. I will give you a glimpse of this diet because I believe it’s the easiest way to start on Tom Brimeyer’s system right away. To find out how to incorporate the diet into Tom’s plan, you’ll have to get the whole e-book and read it carefully.

The Thyroid Diet Revolution is Based on a Synergistic Relationship Between All Areas of Your Life

The Hypothyroidism Diet is much more complex than a few pointers from a list, as it’s derived from research and designed to work with your hormonal system to treat your thyroid condition. Synergy works to build your body back to its healthy state so it can naturally fight your hypothyroidism. This program affects all parts of your life in profoundly beneficial ways, ways you could never imagine; you simply must read the books to discover this amazing program. You’ll learn:

  • How your hormones affect your condition and how to control them
  • A complete list of what foods to eat and what to avoid, covering carbs, proteins and fats
  • A 60-day hypothyroid diet program and meal plan, with no more guessing. This is the golden nugget of the program
  • The cheapest and most reliable hypothyroid testing you can do in your own home. This technique is far more reliable than the common and expensive TSH testing
  • Therapeutic power foods to improve your thyroid function
  • Eating plans for the frequent traveler or workaholic, for someone on a tight budget, for kids, and for the low-carb dieter
  • the effects of stress hormones on your condition and what to do to balance your hormones
  • how to put it all together: diet, stress reduction, hormone program, lifestyle changes specific to your condition and much more


Is The Hypothyroidism Diet Revolution For You?

Let me begin by saying congratulations: you’ve decided to take your personal health matters into your own hands, to take charge and be in control of your health. Now that you’ve made the commitment to a healthy lifestyle that will soon be free from Hypothyroidism, you have the drive to stick with an advanced diet plan that’s actually not for everyone.

Why Isn’t the Hypothyroid Revolution for Everyone?

Truth be told, not everyone can handle this program. Not everyone is committed to their health in a way that will enable them to learn the plan and then stick to it. But if you’ve suffered from Hypothyroidism for years the way so many people have, people who’ve since been helped by Tom Brimeyer’s Hypothyroidism Revolution, you will have the motivation to stay the course.

How it Works

This program is partly about balance:

  • balance life/work
  • balance stress through Tom’s unique program with techniques backed up by research
  • balance hormonal activity with Tom’s Hypothyroid Revolution
  • balance nutrition with the Hypothyroidism Diet:
    • Women: 1,800 to 2,000 calories/day
    • Men: 2,400 to 2,600 calories/day

That’s the Hypothyroidism Diet in a nutshell. Once you learn your ratios, set your calorie range and learn the foods to avoid, you’re halfway there.

A Few Facts About the Hypothyroidism Diet

For an average 2,000-calorie/day diet, you’ll need 125 grams of protein per day. This is a huge amount of protein, 50% of which should come from dairy. That means you will have to start collecting recipes that use a lot of dairy. Good thing that when you get the Low Thyroid Diet Revolution for yourself, The Hypothyroidism Cookbook is part of the package!
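Taking the article's own figures at face value (this is just the stated ratio worked through, not nutritional advice), the arithmetic looks like this:

# Working through the figures quoted above (illustrative arithmetic only).
calories_per_day = 2000
protein_grams = 125            # the article's stated target at 2,000 kcal/day
dairy_share = 0.50             # half of the protein is supposed to come from dairy

dairy_protein_grams = protein_grams * dairy_share      # 62.5 g of protein from dairy
protein_calories = protein_grams * 4                   # protein provides roughly 4 kcal per gram
protein_pct = 100 * protein_calories / calories_per_day

print("%.1f g of protein from dairy per day" % dairy_protein_grams)
print("Protein supplies about %.0f%% of daily calories" % protein_pct)   # about 25%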

The Hypothyroid Diet is Not a DIET Diet…

Also, the Hypothyroidism Diet’s primary function is to rid your body of Hypothyroidism and treat symptoms…not to make you lose weight. You’ll see surprising things here, things that are specific to the body of someone suffering from Hypothyroidism. Here are some surprising items from the Hypothyroidism Diet:

  • Most of your protein should come from dairy
  • Avoid Whey Protein & Iodine
  • Polyunsaturated Fats Are A Major Cause of Hypothyroidism
  • Eat The Right Sugar!
  • Avoid Low Carbs and Low Fat Diet

Now of course these are only a few choice tidbits of information designed to show you how different this diet really is from anything you’ve ever seen before. You must learn the program in order to incorporate the diet correctly, as it all works together in synergistic relationships carefully developed by Mr. Brimeyer.

Order Up Your Own Hypothyroidism Revolution

In fact, you will learn a lot about healthy eating in general, but more importantly you will begin to see an end to your suffering from Hypothyroidism. You can get the entire system today: choose the Digital Program option. But if you prefer a hard-copy upgrade, it’s on sale right now. Today it’s only $127 (normally $347). The digital version is an even better value at $97, also on sale, down from $297.

The best place to buy this program is the official site, www.hypothyroidismrevolution.com. There you will see many real, solid testimonials that prove this program really works.

Also, when you purchase this program, you will get email coaching from Tom Brimeyer himself, so you can ask him any question. Plus, you are protected by a 60-day, 100% no-questions-asked money-back guarantee, so there is absolutely no risk in trying this program.

If you’re ready for your own personal Hypothyroidism Revolution, the Hypothyroidism Cookbook, the Advanced Diet Guide and more, simply click the link below and get started living your life to the fullest right now.



3EE Reviews


Disaster Recovery Plan Barriers (And How to Overcome Them)

February 28, 2018   Big Data

Disaster recovery should be as central to your data management strategy as data security and reliability. But the fact is that it’s not at most organizations. Here’s how to overcome the challenges that most organizations face in implementing an effective disaster recovery plan.

According to Syncsort’s “State of Resilience” survey from January 2018, a full 85 percent of IT professionals report that their organizations either have no disaster recovery plan in place, or are not confident in the ability of their disaster recovery plans to meet their goals.


That’s a pretty remarkable figure. Although disaster recovery may not be the most exciting topic for many IT professionals, it’s a crucial one for any business that wants to guarantee its survival following unexpected events.

Addressing Disaster Recovery Plan Hurdles


To build an effective disaster recovery plan — and avoid being counted among the 85 percent of professionals whose organizations lack one — you need to overcome the following disaster recovery hurdles:

  • Thinking backups are enough

Backing up data is part of the disaster recovery process. But it is only a part. A full disaster recovery solution requires data backups and a plan for restoring data quickly following a disaster. (A minimal sketch of an automated restore test appears after this list.)

  • Thinking cloud data is safe

If your data is stored in the cloud, you may think it can never disappear. In reality, there are lots of reasons why cloud-based data can be lost, ranging from cyber attacks and accidental data deletion to a failure on the part of your cloud provider. Even if you make extensive use of the cloud, you should still have a disaster-recovery plan in place.

  • Focusing only on compliance

In some cases, organizations may fail to implement effective disaster recovery because there are no compliance pressures for them to do so. Compliance frameworks are less likely to mandate disaster recovery than they are to regulate things like data security. However, compliance requirements shouldn’t be the reason you implement disaster recovery. You should have it in place for the sake of protecting your business and your customers, not because a government regulator tells you to create a disaster recovery plan.

  • Failure to understand what “disaster” means

The term disaster recovery can be a bit misleading. To some people, it implies that disaster recovery is necessary only for organizations that are at risk of suffering a major disaster, like a hurricane or massive fire that wipes out their data center. Those types of events are rare (especially if your data center is located in a region not prone to major natural disasters). However, the reality is that disasters come in many forms, and they may not always be large-scale. The failure of a single server could be a disaster if the server contains critical data that you cannot recover. Disaster recovery is about preparing for all types of disruptions, not just the fire-and-brimstone variety.

  • Lack of training

Your IT staff are well versed in tasks like installing software and storing data. Those are the things they learn in training programs and from on-the-job experience. But chances are that most of them lack prior experience with disaster recovery, and never formally studied the topic as part of training. This means that teaching your team to implement proper disaster recovery will require them to invest time in learning something new.

  • Lack of time and money

Spare time and money tend to be in short supply at most organizations. This shouldn’t be a reason for not investing in disaster recovery planning, however — especially when you consider how much the lack of a disaster recovery solution will cost you in the long run. Downtime costs vary according to the size of your business, of course, but put simply, they are enormous. Downtime also leads to employee productivity loss on the order of 78 percent. That’s a lot of money and staff time that can be saved by having an effective disaster recovery plan in place before a disruption occurs.
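Returning to the first hurdle above: a backup only counts once you have proven it can be restored. Here is a minimal, generic sketch of a scheduled restore test; the backup directory, database name, and restore command are placeholders to adapt to whatever platform and tooling you actually use.

# Generic sketch of a scheduled restore test; directory, database name, and
# restore command are placeholders, not any vendor's real tooling.
import datetime
import glob
import os
import subprocess
import sys

BACKUP_DIR = "/backups/salesdb"            # hypothetical backup location
RESTORE_TARGET = "salesdb_restore_test"    # scratch database used only for testing

def latest_backup(path):
    files = glob.glob(os.path.join(path, "*.bak"))
    if not files:
        sys.exit("No backup files found - the recovery plan has already failed.")
    return max(files, key=os.path.getmtime)

def restore_and_verify(backup_file):
    # Placeholder command; substitute your platform's actual restore tooling.
    cmd = ["restore-tool", "--file", backup_file, "--target", RESTORE_TARGET]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        print("Restore failed:", result.stderr)
        return False
    # A real check would also run a row-count or checksum query against the restored copy.
    return True

if __name__ == "__main__":
    backup = latest_backup(BACKUP_DIR)
    ok = restore_and_verify(backup)
    print("%s: restore test of %s %s" % (datetime.date.today(), backup,
                                         "passed" if ok else "FAILED"))

Running a check like this on a schedule also gives you a realistic measure of how long a restore actually takes, which feeds directly into the time and money argument above.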

To learn more about the state of disaster recovery preparedness in organizations today, read Syncsort’s full “State of Resilience” report.


Syncsort + Trillium Software Blog
