
Tag Archives: Developing

Developing a backup plan

November 30, 2020   BI News and Info

The most important task for a DBA is being able to recover a database in the event that it becomes corrupted. Databases can become corrupted for many different reasons. The most common cause is a programming error, but hardware failures can also corrupt a database. Regardless of how a database becomes corrupt, a DBA needs a solid backup strategy in order to restore the database with minimal data loss. In this article, I will discuss how to identify the backup requirements for a database, and then how to use those requirements to develop a backup strategy.

Why develop a backup plan?

You might be wondering why you need to develop a backup plan. Can’t a DBA just implement a daily backup of each database and call it good? That might work, but it doesn’t consider how an application uses a database. If a database is only updated by a nightly batch process, then a daily backup taken right after the nightly update might be all you need. But what if a database is updated all day long by an online application? With only one daily backup, you could lose up to a day’s worth of online transactions if the database failed right before the next backup. Losing a day’s worth of transactions would most likely be unacceptable. Therefore, to ensure minimal data loss occurs when restoring a database, identify the backup and recovery requirements before building a backup solution for a database.

Identifying backup and recovery requirements

Each database may have different backup and recovery requirements. When discussing backup and recovery requirements for a database, there are two different types of requirements to consider. The first requirement is how much data can be lost in the event of a database becoming corrupted. Knowing how much data can be lost will determine the types of database backups you need to take, and how often you take those backups. This requirement is commonly called the recovery point objective (RPO).

The second backup requirement to consider is how long it will take to recover a corrupted database. This requirement is commonly called the recovery time objective (RTO). The RTO identifies how long the database can be down while the DBA is recovering it. When defining the RTO, make sure to consider more than just how long it takes to restore the database. Other tasks take time as well: identifying which backup files need to be used, finding those files, building the restore script/process, and communicating with customers.

A DBA should not identify the RTO and RPO in a vacuum. The DBA should consult each application owner to set the RTO and RPO requirements for each database the application uses. The customers are the ones who should drive the RTO and RPO requirements, with help from the DBA of course. Once the DBA and the customer have determined the appropriate RTO and RPO, the DBA can develop the backups needed to meet these requirements.

Types of backups to consider

There are a number of different backup types you could consider; see my previous article for the full list. Of those, three types support most backup and recovery strategies: Full, Differential, and Transaction log.

The Full backup, as it sounds, is a backup that copies the entire database off to a backup device. The backup contains all the used data pages of the database. A full backup can be used to restore the entire database to the point in time that the full backup completed. I say completed because any update commands run against the database while the backup is running are included in the backup. Therefore, when you restore from a full backup, you restore the database to the point in time that the backup completed.
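As a rough illustration, a full backup is a single BACKUP DATABASE command. The database name and file path below are placeholders, not values from this article:

```sql
-- Take a full backup of a hypothetical SalesDB database.
-- Database name and backup path are examples only.
BACKUP DATABASE SalesDB
TO DISK = N'D:\Backups\SalesDB_Full.bak'
WITH INIT,       -- overwrite any existing backup set in this file
     CHECKSUM,   -- verify page checksums while backing up
     STATS = 10; -- report progress every 10 percent
```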

A differential backup is a backup that copies all the changes since the last full backup off to a backup device. Differential backups are useful for large databases, where only a small number of updates have been performed since the full backup. Differential backups will run faster and take up less space on the backup device. A differential backup can’t be used to restore a database by itself. The differential backup is used in conjunction with a full backup to restore a database to the point in time that the differential backup completed. This means the full backup is restored first, then followed by restoring the differential backup.
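A minimal sketch of the differential backup and the two-step restore sequence it implies, again using the hypothetical SalesDB name and paths:

```sql
-- Differential backup: only the pages changed since the last full backup.
BACKUP DATABASE SalesDB
TO DISK = N'D:\Backups\SalesDB_Diff.bak'
WITH DIFFERENTIAL, INIT;

-- To restore, the full backup goes first. NORECOVERY leaves the database
-- in a restoring state so it can accept the differential.
RESTORE DATABASE SalesDB
FROM DISK = N'D:\Backups\SalesDB_Full.bak'
WITH NORECOVERY;

-- Then the differential brings it to the point the differential completed.
RESTORE DATABASE SalesDB
FROM DISK = N'D:\Backups\SalesDB_Diff.bak'
WITH RECOVERY;
```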

The last type of backup is a transaction log backup. A transaction log backup copies all the transactions in the transaction log file to a backup device. It also removes completed transactions from the transaction log to keep it from growing out of control. A transaction log backup, like a differential backup, can’t be used by itself to restore a database. It is used in conjunction with a full backup, and possibly a differential backup, to restore a database to a specific point in time. The advantage of a transaction log backup is that you can tell the restore process to stop at any point in time covered by the backup. By using this stop feature, you can restore a database right up to the moment before it became corrupted. Transaction log backups are typically taken frequently, so there might be many of them between each full or differential backup. Transaction log backups are beneficial when the requirement is minimal data loss in the event of a database becoming corrupted.
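The stop feature described above is the STOPAT option of RESTORE LOG. A sketch, assuming the same hypothetical SalesDB and a database in the FULL recovery model (the timestamp is an example only):

```sql
-- Back up the transaction log (requires FULL or BULK_LOGGED recovery model).
BACKUP LOG SalesDB
TO DISK = N'D:\Backups\SalesDB_Log.trn'
WITH INIT;

-- Point-in-time restore: full backup first, left in a restoring state...
RESTORE DATABASE SalesDB
FROM DISK = N'D:\Backups\SalesDB_Full.bak'
WITH NORECOVERY;

-- ...then replay the log, stopping just before the corruption occurred.
RESTORE LOG SalesDB
FROM DISK = N'D:\Backups\SalesDB_Log.trn'
WITH STOPAT = N'2020-11-30T14:59:00', RECOVERY;
```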

Developing a backup strategy for a database

When determining a backup plan for a database, you need to determine how much data can be lost and how long it takes to recover the database. This is where the RTO and RPO come in to determine which types of database backups should be taken. In the sections below, I will outline different database usage situations and then discuss how one or more of the backup types could be used to restore the database to meet the application owner’s RTO and RPO requirements.

Scenario #1: Batch updates only

When I say “Batch Updates Only”, I am referring to a database that is updated only by a batch process, meaning it is not updated online or by an ad hoc update process. One example of this type of database is one that receives updates in a flat-file format from a third-party source on a schedule. When a database receives updates via a flat file, those updates are applied using a well-defined update process, typically scheduled to coincide with when the flat file is received from the third-party source. In this kind of update situation, the customer would define an RPO something like this: “In the event of a corrupted database, the database needs to be restored to the point immediately after the last batch update process completed.” The RTO would be set to something like this: “The restore process needs to be completed within X number of hours.”

When a database is only updated with a batch process that runs on a schedule, all you need is a full backup taken right after the database has been updated. By doing this, you can recover to a point in time right after the update. The full backup alone will meet the RPO. Since the time needed to restore a full backup is about the same as the time it takes to take the backup, the RTO needs to be at least as long as the restore process, plus a little more time for organizing and communicating the restore operation.

Scenario #2 – Batch updates only, with short backup window

This scenario is similar to the last one, but in this situation, there is very little time to take a backup after the batch processing completes. The time it takes to back up a database is directly proportional to the amount of data that needs to be backed up. The backup window might be too short to take a full backup every time the database is updated; this might be the case when the database is very large. If there isn’t time for a full database backup and the amount of data updated is small, then a differential backup is a good choice to meet the RPO/RTO requirements. With a differential backup, only the updates since the last full backup are copied to the backup device. Because only the updates are backed up and not the entire database, a differential backup can run much faster than a full backup. Keep in mind that to restore a differential backup, you must first restore the full backup. In this situation, a full backup needs to be taken periodically, with differential backups taken in between. A common schedule would be to take the full backup when there is a large batch window, like on a Sunday when there is no batch processing, and then take differential backups on the days when the batch window is short.

Scenario #3 – Ad hoc batch updates only

Some databases are not updated on a schedule but instead are updated periodically by an ad hoc batch update process that is manually kicked off. There are a couple of different ways to handle backing up databases that fall into this category. The first is simply to run full database backups on a routine schedule. The second is to trigger a backup as the last step of the ad hoc batch update process.

A routinely scheduled full backup is not ideal because it may or may not run soon after the ad hoc batch update process. During the gap between the ad hoc process and the scheduled full backup, the database is vulnerable to data loss should it become corrupted before the full backup is taken. To minimize the time between the ad hoc update and the database backup, it is better to add a backup command to the end of the ad hoc update process. This way, a backup runs soon after the ad hoc process, which minimizes the window in which data could be lost. Additionally, by adding a backup command to the ad hoc update process, you potentially take fewer backups, reducing processing time and backup device space compared to a routine backup schedule.
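A sketch of that final step of a hypothetical ad hoc update procedure: back up the database as soon as the updates commit, with a timestamped file name so each run keeps its own backup (names and path are assumptions):

```sql
-- Final step of the ad hoc batch update: back up immediately,
-- using a timestamped file name like SalesDB_Full_2020-11-30T140000.bak.
DECLARE @file nvarchar(260) =
    N'D:\Backups\SalesDB_Full_' +
    REPLACE(CONVERT(nvarchar(19), SYSDATETIME(), 126), ':', '') + N'.bak';

BACKUP DATABASE SalesDB
TO DISK = @file
WITH CHECKSUM, STATS = 25;
```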

Scenario #4 – Online updates during business hours

In this scenario, the database gets updates from online transactions, but these transactions are only run during business hours, say 8 AM to 5 PM. Outside of regular business hours, the database is not updated. Here you might consider a combination of two backup types: full and transaction log backups. The full backup would run off-hours, meaning after 5 PM and before 8 AM. Transaction log backups would run during business hours to back up the online transactions shortly after they are made. In this situation, you need to review the RPO to determine how often to run transaction log backups: the shorter the RPO, the more often you need to take them. For example, suppose a customer says they can lose no more than an hour’s worth of transactions; then you need to run a transaction log backup every hour between 8 AM and 5 PM.
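The hourly log backup itself is one small script, which could, for example, be run by a SQL Server Agent job scheduled every hour between 8 AM and 5 PM. The timestamped file name keeps each hour's backup distinct (database name and path are placeholders):

```sql
-- One hourly transaction log backup; the file name carries the date and hour,
-- e.g. SalesDB_Log_20201130_0900.trn.
DECLARE @file nvarchar(260) =
    N'D:\Backups\SalesDB_Log_' +
    FORMAT(SYSDATETIME(), 'yyyyMMdd_HHmm') + N'.trn';

BACKUP LOG SalesDB
TO DISK = @file;
```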

Scenario #5 – Online updates 24×7

Some databases are accessed every day all day. This is very similar to Scenario #4, but in this case, the database is accessed and updated online 24×7. To handle backup and recovery in this situation, you would take a combination of full and differential backups along with transaction log backups.

With a database that is updated 24×7, you want to run the full and differential backups at times when the databases have the least number of online updates happening. By doing this, the performance impact caused by the backups will be minimized. There are way too many different database situations to tell you exactly how often a full or differential backup should be taken. I would recommend you try to take a full backup or differential backup daily if that is possible. By doing it daily, you will have fewer backup files involved in your recovery process.

The transaction log backups are used to minimize data loss. As in Scenario #4, the frequency of transaction log backups is determined by the RPO requirements. Assuming the customer can lose one hour’s worth of transactions, transaction log backups would need to run hourly, all day, every day, to cover the 24×7 online processing.

Developing a backup plan

It is important for a DBA to work with the application owners to identify the backup and recovery requirements for their databases. The application owners determine how much data they can lose (RPO), and how long the database can be down while it is being recovered (RTO). Once the RTO and RPO requirements are defined, the DBA can then develop a backup plan that aligns with these requirements.


SQL – Simple Talk


Q&A: Research at Washington University in St. Louis is Aimed at Developing a Precision Medicine Approach to Fight Neurological Diseases

November 8, 2020   TIBCO Spotfire


Researchers at Washington University School of Medicine in St. Louis are developing a precision medicine approach to fighting neurological diseases, screening thousands of cell mutations to find and alter the ones that cause illness. Washington University in St. Louis presented an overview of the work at TIBCO NOW 2020. We sat down with William Buchser, assistant professor of genetics and director of the F.I.V.E. @ MGI facility, and Jack Bramley, a scientist in the Buchser Lab, to get a closer look.

TIBCO: What does your research entail?

William Buchser: There are a number of different therapies out there used to treat genetic diseases. One of the newest is gene therapy, utilizing a virus to “fix” a gene because it was broken. As that tool becomes more available therapeutically, then the question becomes, “What gene do we fix?” That’s where we come in—to figure out what gene is broken so that we can pass that information on to scientists to conduct additional research that may lead to the development of gene therapy to fix the problem. Finding these mutations is called functional genomics, and we employ CRISPR to generate mutations in human cell lines that we screen for microscopic markers of disease.

TIBCO: What are some of your challenges with regard to the data in your research?

WB: We humans have about 20,000 genes in our genome, and the DNA sequences of those genes can vary from person to person. If you consider small changes in the letters of those genes (called variants), then each of us has about a million of these compared with any other person. Someone with a disease has almost a million variants that are not harmful and only a handful that are. The challenge is having a tool to manage the datasets and large numbers of combinations of variants that we are testing and keep track of how the data moved through the analysis system in a natural way.

TIBCO: In addition to variants, what other data points do you look at?

JB: We have a number of different data sources that we use. We use confocal imaging to take pictures of millions of cells and figure out—through predictive analytics—if that cell is harboring a disease-causing variant. The dataset considers these cells as a matrix of features. We then get sequencing data for particular cells chosen for further analysis, and there is experimental metadata connected with the cells as well.

TIBCO: How do you use TIBCO software?

JB: We use TIBCO Spotfire to bring in our cell-based datasets and run it through quality control procedures. We also use it to generate our early machine learning models quickly, so we can start selecting cells for picking as soon as possible. We also use TIBCO Data Science Workbench (Statistica) to generate more advanced models to predict which cells are perturbed by the mutations. Once sequence data is available, Spotfire Data Canvas is used to bring all the data together and make our final analyses.

TIBCO: How many people are using the tools?

WB: Seven full-time members of the lab and five undergraduate students who rotate through the lab. We show the students how to use the tool, and then they can sit down in front of a dataset and actually see the data, manipulate it, and see the results. It is wonderful for WashU and our group to have access to these tools and technologies that assist with cutting-edge research but also expose students to the tools and facilitate their exploration of the technology.


To learn more about how universities and other academic institutions are leveraging TIBCO products, check out the TIBCO Academic Alliance.


The TIBCO Blog


Jermaine Fowler Developing Animated Comedy Series At Fox

October 3, 2020   Humor

Yet another animated comedy could be coming to Fox.

The network has put an untitled project from comedian Jermaine Fowler into development, Variety has learned.

The prospective series is loosely based on Fowler’s formative years growing up in a working-class town in Prince George’s County, Maryland, with his family.

Fowler will serve as creator, writer, and executive producer on the project, which hails from Fox Entertainment. He is arguably best known for his two-season performance as Franco Wicks on the CBS sitcom “Superior Donuts.” Since then, he has starred in Pete Holmes’ HBO comedy “Crashing,” and voiced the character of Pete Repeat on Netflix’s “BoJack Horseman.” His TV credits also include “The Eric Andre Show.”

On the film side, Fowler will appear next year opposite Daniel Kaluuya and Lakeith Stanfield in “Judas and the Black Messiah,” a biopic about the assassination of Black Panther Party leader Fred Hampton.

Beyond its Animation Domination Sunday lineup (which currently comprises “The Simpsons,” “Bob’s Burgers,” “Family Guy” and “Bless the Harts”), Fox also has multiple other animated projects on its development slate. Just over a month ago, it was revealed the network has an “X-Files” animated comedy spinoff in the works, handing it a script and presentation commitment. Back in late July, Variety exclusively reported that Fox gave Vince Vaughn’s Wild West Picture Show production company a blind script deal for an animated series, with Vaughn himself attached to executive produce.

Fowler is represented by UTA, Management 360, ID & Morris Yorn Barnes Levine Krintzman Rubenstein Kohner & Gellman.

Source: Variety



The Humor Mill


Developing an effective lead scoring methodology in Dynamics 365

September 25, 2020   CRM News and Info

Lead scoring models provide marketers and salespeople with a way to see how close a lead is to becoming an opportunity or even a sale. Typically, lead scoring models calculate a score for each lead based on demographic details such as location or job title, and/or behaviors like opening a marketing email or visiting a website. Each action can generate a point value for the lead that adds up to a cumulative score. So how do you know if your scores are accurate for where the lead is in the sales cycle? The key to an effective lead scoring model is reviewing and adjusting point values as you go. Before diving into methodology, let’s look at how lead scoring models work in Dynamics 365 Marketing.

How to create a lead scoring model in Dynamics 365 Marketing

The Lead Scoring entity in Dynamics 365 is located in Dynamics 365 Marketing. If you are unfamiliar with Dynamics 365 Marketing, it is Microsoft’s answer to HubSpot: a robust digital marketing application built into the Dynamics 365 ecosystem. On top of email marketing, event marketing, social media marketing, and marketing automation, Microsoft has built a lead scoring entity to track lead scores based on marketing campaigns within Dynamics 365.

For more information about setting up a lead scoring model in Dynamics 365, check out this blog that walks through the entire process. It’s important to note that before a lead can be scored it must be associated with a contact or account record in Dynamics 365.

Scoring methodology

The most important piece of a lead scoring model is the point values attached to lead actions or contact attributes. If a point value is set too high, a salesperson may mistakenly think that the lead is ready for the next step in the sales cycle and attempt to move it forward at the wrong point. While initial point values should be considered carefully, it will take constant evaluation and adjustment to get to a point where your lead scoring model accurately tells you where the lead is in the sales cycle.

A quick Google search will turn up point scoring methodologies that tell you exactly what point values you should assign to each action. The truth is, there is no one-size-fits-all method for assigning point values. Your sales process and target market are not the same as every other business’s. So where should you start? Here are some ideas to think about when building your first scoring model.

Determine a scoring scale

Every lead scoring model needs a scoring scale. Until you know what score range a lead can have, you won’t be able to give an action a point value. An easy scoring scale is 0-100. This allows you to think of a lead’s progress in the sales cycle as a 0-100 percentage. The issue with this scale is the volume of scoring actions. If your organization runs email campaigns regularly, certain actions like email opens and website visits may add up quickly. A 0-300 scoring scale may be useful for organizations that have a high volume of scored actions.

Determine a grading scale

Once a scoring scale is determined, a grading scale needs to be added to give the lead owner a glanceable view of how hot the lead is. You can have unlimited grades in a lead scoring model, but that doesn’t mean you should over-grade. A great place to start is to set three grades that evenly split the scoring scale. For example, if you are using a 0-100 scale, you might set a Cold grade for leads scored 0-33, a Warm grade for leads scored 34-66, and a Hot grade for leads scored 67 or above. You can also play around with naming conventions for grades, such as A, B, C or another unique identifier. Additional grades can be added at any point, meaning it is easy to segment leads further based on their position in the sales cycle if you determine it necessary.

Set Point values

After determining a scoring scale and grading scale, the final step is to define what points should be given to a lead when they take an action or belong to a specific demographic. Point values should be relative to your scale, so if you are using a 0-100 scale, the point value of an action should be lower than if you are using a 0-300 scale. This blog will not provide specific point values you should use, but instead provide recommendations on actions that should be valued higher than others.

Basic actions that show interest in your company should be valued lower than others if scored individually. Examples of these types of actions include single email opens, email forwards, email clicks, website visits, etc. If, however, you are grouping these actions into a single score, that value might be higher. An example of this would be setting the score of a single email open to 2 points or setting 5 email opens to 15 points.

Other actions are more deliberate and thus should carry higher point values. Examples of these types of actions include Event registrations, form submissions, segment subscriptions, etc. Understanding what types of actions lead to higher chances of a sale will improve the accuracy of your lead scoring models and help your salespeople focus on the right prospects.

Get help with your lead scoring model in Dynamics 365

Schedule a call with a Dynamics 365 consultant to discuss your sales process and provide recommendations for building and monitoring an effective lead scoring model in Dynamics 365 Marketing.


CRM Software Blog | Dynamics 365


IBM uses AI to evaluate risk of developing genetic diseases

August 20, 2020   Big Data


In a study published in the journal Nature Communications, scientists at IBM, the Broad Institute of MIT and Harvard, and health tech company Color find evidence that the presence of genetic mutations isn’t a reliable precursor to genetic diseases. They claim diseases can be so greatly influenced by other factors that the risk in carriers is sometimes as low as in that of noncarriers.

The research — which stems from a larger, three-year collaboration between IBM Research and the Broad Institute that was announced in 2019 — aims to support clinicians leveraging data to better identify patients at serious risk for conditions like cardiovascular disease. Insights could be useful in making health care and prevention decisions, helping clinicians choose whether to recommend imaging or more drastic, surgical interventions, like mastectomies.

In the course of the study, an IBM-led team developed models that analyze a person’s genetic risk factors, clinical health records, and biomarker data to more accurately predict the onset of conditions like heart attacks, sudden cardiac death, and atrial fibrillation. With Color, the researchers investigated whether polygenic background — the variants and factors within an individual’s genome — could influence the occurrence of disease in genomic conditions such as familial hypercholesterolemia, hereditary breast and ovarian cancer, and Lynch syndrome.

The coauthors analyzed de-identified records from over 80,000 patients across two large data sets, the UK Biobank and Color’s own. Among carriers of a monogenic risk variant, they identified “substantial variations” in risk based on polygenic background, implying carriers don’t always develop diseases. For example, the probability of developing coronary artery disease (CAD) by age 75 ranged from 17.5% to 77.9% for carriers of familial hypercholesterolemia. For comparison, CAD prevalence among noncarriers ranged from 13% if they had a low-risk polygenic score to 41% with a high-risk score.

“By leveraging large databases to combine and analyze medical and genomic data from tens of thousands of people, we have been able to shed significant new light on a number of serious, chronic diseases,” IBM Research principal scientist and coauthor Kenney Ng said. “Ultimately, our findings unveil a silver lining: Even if an individual carries a genetic mutation associated with one of these diseases, their absolute risk might not be as set in stone as previously thought. In fact, their absolute risk might be nearly equivalent to an individual who doesn’t carry the mutation at all — depending on other factors and mutations within their specific genome.”

IBM says future work will include investigating ways genomics, clinical data, and AI can be harnessed to develop new tools that offer health professionals insight into disease risk. The goal is to build algorithms that accurately indicate a predisposition to a health condition and make those tools available — including methods to calculate an individual’s risk of disease based on variants in a genome.

IBM previously collaborated with the Broad Institute in 2016. As part of a five-year project, the company sought to help researchers using AI and genomics better understand how cancers become resistant to therapies.


Big Data – VentureBeat


COVID-19: The after economy and developing appropriate responses

June 8, 2020   CRM News and Info

I’m in the midst of revamping a lot of projects and programs — partially due to the new realities that we will face (though what they are remain to be seen to a large extent), and partially due to some changes I want to make in my business model and my life in general. So, I haven’t written much, though you will see something from me very soon, and that will trigger quite a bit of content both here and elsewhere.


In the meantime, I am going to provide a forum for some of the other thought leaders in their respective spaces to give you something to chew over that isn’t just fat. Chewing fat is unhealthy, so that isn’t a good thing. So, this is the chewing of the good proteins kind of content. And one of the best people I know to provide you with that is Marshall Lager. I’m sure that many of you know him. 

I met this good man when he did a column for CRM Magazine, and I have watched him go on to a career as an analyst for at first Ovum, then G2, and now, as an independent analyst. He is incredibly cogent, bright, and funny, and what he says should be taken seriously, even when you might be laughing. It’s not just his voice and style and ability to engage an audience; his content is meaningful. Look, I know I tend to think the best of people and thus wax effusively, but Marshall truly is one of the best writers I’ve ever known — and that’s for style and content.

So, take heed of what he’s saying here. He is speaking to some of the needs and the expectations post-pandemic. It won’t be post-COVID for a long time, but we may get it to the point of control, and Marshall is raising, as always, valid and straightforward concerns and some solutions for those concerns, and remains humble.

Take it away, Marshall! And I hope that everybody is staying safe and sane in this time of crisis.


The perpetual uncertainty of this extended wait-and-see pandemic response has really taken its toll on human behavior, and especially on businesses that rely on regular traffic. Health and safety, crowds and protesters, and shortages of goods have to be taken into account every time we want to buy something, to an extent most people in the developed world never thought possible. And it’s not even nearly as bad as it could be.

At this point, two months into the first really bad phase of the pandemic (give or take; I’m not sure when this will go live), we appear to lack a cohesive plan. Not the action-stations drill that we executed (poorly) when the virus proved to be more than a minor concern. We need to develop ideas for what various facets of commercial life will look like after we crawl out from our bunkers for good.

This isn’t a political column, so I’ll refrain from giving my opinion of the response by local and national governments; what I’m looking for isn’t something that can be handed down from there anyway, for the most part. Businesses, large and small, have got to look at their operations and figure out how to address the sort of disruption that comes from a lengthy period of time when customers can’t do what is customary.

I have to admit that it took me a lot longer than I expected to write this article. Partly, it’s because my original motivation came from a more personal place, the same one that most people are in right now. We are waiting to get back to the lives and jobs we remember, but we don’t feel there has been much guidance in regard to how or when that will happen. Mostly, it’s because I’m not the futurist that some of my colleagues are, and my expertise is more along the lines of assessing customer experience in the moment than in planning its shape for tomorrow.

Some businesses have proposed a few visions of the “new normal” (a phrase many of you have probably come to hate by now) for their post-lockdown operations, and that’s at least a start. What we haven’t really had yet is a come-together moment to show the public that there is organized thought being given to the issue. Modern economies run on confidence in the system, and we’re fresh out.

A large number of industry working groups are devoting real skull-sweat to developing broader solutions to address any future crises; organizations of bakers, emergency services workers, and semiconductor manufacturers (to name just a few) have formed committees to develop appropriate responses. It’s those groups who need to better communicate their efforts to the public in order to restore confidence. News isn’t traveling quite as fast during the lockdown, and people want to know what to expect once it’s safe to congregate in public again. It’s great that businesses are considering the everyday experiences of consumers under pressure and finding ways to make the commercial world continue to function for everybody, from individuals to small businesses to mighty conglomerates, but it’s not being communicated effectively yet. We need transparency.

We have learned that many of the people who receive minimum wage (or less) are essential workers, to use the current popular term. Economic disease recovery is going to have to include real recognition that these jobs are a true life buoy — they keep families afloat, provide access to vital goods and services, and prevent financial ruin. They deserve better, and they deserve more, and sooner or later their employers will have to do something about that by redirecting some resources that would normally be taken as profit, turning them into higher pay and better quality of life.

I can already hear your voices from the future, talking about fiduciary responsibility to the shareholders and disproportionate burden on small businesses. Well, if we’re trying to sculpt the future, maybe we should consider our history first. If you look at the progression of economic paradigms, there is a steady (if sometimes glacially slow) trend: more opportunity and wealth come when people have greater access to the profits of enterprise. Mercantilism gave way to open trade; unrestrained capitalism was moderated by anti-monopoly laws and the rise of unions; each time, overall profits improved in step with the human condition.

I believe that we’re due for another round of changes. Salaries for exempt employees have kept up with the cost of living, more or less; wages for non-exempt workers are not even close, and it’s those essential hourly workers who are receiving much praise but little actual support. We prefer to do business with companies that share our values, right? Well, one of my values is knowing when extracting profit is not as important as taking care of the people who earn it, and giving them a reason to want the company to succeed.

Less confrontationally stated, executives and shareholders should have the enlightened self-interest to realize that improving wage-earners’ situations improves employee loyalty, strengthens brand expression, and leads to continued success in the long run. The unfortunate image of workers who can barely support themselves, let alone a family, despite holding a full-time job and possibly a side job as well, has to change. Imagine how effective such a shift in prosperity and respect could be in motivating workers, not just because they need their jobs, but because they are proud of those jobs and can look at their paychecks without worrying about which bills they will and won’t pay this month.

Earlier, I said this was the first really bad phase of the pandemic, and I meant it. I’m no epidemiologist, but some things follow a pattern, and the spread of disease is one that history allows us to track. Whether we’re talking about the Black Death, typhoid, the flu of 1918 (which wasn’t from Spain), or our new friend the coronavirus, care must be taken after the first round of outbreaks or there will be a second, often much larger one. Reopening society too soon might force us right back into isolation. That’s going to be the first test of what we’ve learned. Will we return to lockdown with ease, or will we stumble again as we switch directions?

Here’s something to think about regarding our eventual recovery from and adjustment to the pandemic: For years, we have been watching and lamenting the decline of bricks-and-mortar shopping as e-commerce has supplanted it. Yet the very thing driving the economy down, and the thing which has consumers most desperate to return to normal, is the inability of local, physical businesses to operate normally. Nobody is allowed to congregate at shops, but everybody wants to, and businesses are suffering — especially small businesses. We get a different experience from local SMBs than we do from national chains, and it’s what we seem to want, so that’s the dollar they should chase.

The emergency situation in retail does have at least one good side to it — businesses such as grocery markets, department stores, and restaurants are our laboratories for developing effective coping strategies. When all the shops enforce social distancing and better hygienic behavior, it becomes the new mode of operation. This ties back into better pay and conditions for workers — too many people treat workers with disrespect and even hostility because they believe wearing a name tag makes them powerless and disposable. An empowered workforce doesn’t have to take that abuse, and it’s more likely that patrons will have friends and family (even themselves) in similar dignified positions, creating empathy.

We can hope that businesses (and consumers) will know how to adapt once we start to work our way out without losing too many steps. Hope is not certainty. The less we prepare for what’s to come, the more gaffes we’ll make when it arrives. We need a clear and manageable crisis model for B2B and B2C before we’re caught with our pants down again, and — here’s the important part — the transparency to communicate it to the public in advance so we know what to expect. Businesses are so afraid of small losses in valuation that they don’t manage consumer expectations, resulting in huge losses in valuation when things get bad. Be proactive: get the word out that you’re monitoring a situation and are making plans to cope with what comes of it. This is not showing weakness or eroding confidence; it is leadership, and leadership strengthens brands.

Something as simple as Green – Yellow – Red statuses with attendant precautions would be a good start. Any protocol put in place will have to be mandatory, and if you aren’t willing to follow company rules you can’t do business there. We don’t have a problem with No Shirt – No Shoes – No Service, so how hard can it be to add masks to the mix? Custom-printed paper face masks could be new merch for your brand.

A better understanding of which roles are essential on-site, and which are remote work-friendly, will also aid clarity — and if businesses aren’t able for some reason to provide higher general wages to the essential workers, they can at least institute generous hazard pay when we reach our next time of need. Government needs to do its part in making lengthy lockdowns less onerous for business owners as well. Leadership is about more than getting us through the current crisis; it’s also about preparing us to tackle the next one.


Thank you, Marshall! As always, your writing is great. 

ANNOUNCEMENT: The CRM Watchlist 2021 is now re-opening for registration. I had to make some changes to the questionnaire, and that took some time given the change in how impact is going to be “measured” going forward. So, even if you have registered, I will need you to re-register. Please send me an email at paul-greenberg3@the56group.com and ask for the registration form. The registration period and, to some extent, the questionnaire submission period have been extended. Thanks for your patience. Because it is late, if I feel that we have an insufficient number of registrations by a certain date, I will suspend the Watchlist for this year and pick it up next year. I hope that isn’t the case, but it would certainly be understandable.

Have a great one and be safe.


ZDNet | crm RSS


Gabrielle Union Developing Drama Titled ‘Afro.Punks’ at HBO Max

February 4, 2020   Humor

Gabrielle Union is continuing to make moves behind the scenes as a producer. The latest project she has lined up is Afro.Punks, a coming-of-age drama series that is in development at HBO Max.

The hour-long series is from writer Jenina Kibuka, Union and Holly Shakoor Fleischer’s I’ll Have Another Productions and Sony Pictures TV. I’ll Have Another is under a deal at SPTV. Anton Cropper will direct and executive produce. Cropper also directs and executive produces L.A.’s Finest, which Union stars in for Spectrum Originals.

Deadline, which first reported the news, calls Afro.Punks “a coming-of-age, one-hour drama that will follow three teenage misfits on the brink of rebellion as they navigate life, love and Afro-Punk.”

Union’s other projects include Morgan Cooper’s Black Coffee, which is set up at Quibi.

Source: Shadow & Act


‘Lethal Weapon 5′ Is In Development At Warner Brothers


The Humor Mill


Developing and Cultivating Strong Company Values

June 24, 2019   CRM News and Info

In today’s business world, we are accustomed to measuring the success of our organization based on factors such as growth, conversion rates, and how our overall results compare to those of our competitors. What many companies and leaders fail to realize, however, is that cultivating strong company values and promoting a healthy and thriving work environment contribute as much to their bottom line as securing and converting leads.

Your employees will ultimately help you get to where you want to be, so encouraging them to commit to your cause and remain loyal to your organization plays a huge role in motivating them to go above and beyond to get you the results you want. In fact, a study by Gallup found that teams with high engagement demonstrate a 21% increase in profitability (1). Plus, keeping your employees satisfied often leads to unintended benefits such as high employee retention, which will save you money in the long run.

An uninspiring workplace, in turn, can negatively impact your reputation in the community, lead to poor employee performance, and decrease your bottom line. Furthermore, if your prospective customers get word of your poor company culture, this can cause concern about your employee retention rates and your ability to deliver a great product or service in the future.

Still not convinced that developing strong values and a positive work environment can have a huge impact on your overall success? These tips will encourage employee engagement and satisfaction — and get you the type of results you crave.

Survey Employees to Learn What Matters to Them

An easy way to figure out what your employees value and how they perceive your brand is by simply asking them. Send out an initial survey form asking your employees to identify 3-5 values that are important to them in their current role. Using these responses, pick out the top 20 keywords/themes and send out a second survey to further narrow down your list.

After you’ve gathered employee responses from your second survey, have your leadership team or an appointed committee meet to analyze and interpret these results. From this list, pick 4-5 keywords or themes that will represent your company values and develop a strategy of how you will implement them into your work moving forward.
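For illustration only (the function name and sample data below are hypothetical, not from any particular survey tool), the tallying step can be sketched in a few lines of Python: collect every value employees named, normalize the wording, and rank the most common.

```python
from collections import Counter

def top_themes(responses, n=20):
    """Tally the values employees named and return the n most common.

    `responses` is a list of lists: each inner list holds the 3-5
    values one employee submitted in the first survey round.
    """
    counts = Counter(
        value.strip().lower()
        for response in responses
        for value in response
    )
    # most_common() breaks ties by first appearance, which is fine here.
    return [theme for theme, _ in counts.most_common(n)]

responses = [
    ["Integrity", "growth", "teamwork"],
    ["Teamwork", "flexibility", "integrity"],
    ["Growth", "integrity", "transparency"],
]
print(top_themes(responses, n=3))  # → ['integrity', 'growth', 'teamwork']
```

From the ranked list, the leadership team or committee would keep the top 20 themes for the second survey round, then repeat the exercise to narrow down to the final 4-5 values.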

Ask Your Employees for Feedback

Transforming your workplace and cultivating strong company values isn’t something that will happen overnight. Occasionally, you might wonder if your efforts are having any impact at all. The reality is that some of the strategies you implement to promote this change will work better than others, and it’s up to you to figure out where it’s best to invest your time and effort.

A quick and easy way to determine what is resonating with your employees is by asking them to give you feedback through an NPS survey. At Act-On, we use AskNicely to survey both employees and customers about how we’re doing and how we can improve. Asking our employees to rate and provide feedback shows them that we care about meeting them where they’re at, and the insights they provide give us a better idea of how we can do that.

Identify How Each Department Contributes to Your Mission and Vision

Naming a few values that you feel apply to the work of your organization is not enough to promote a healthy and thriving work environment. If you truly want to gain buy-in for your new values, you have to encourage your employees to play a part in helping you realize those goals.

To do this, we suggest encouraging each of your teams to set apart some time to discuss how they see your company values reflected in their work, as well as what changes they think they can make to further promote them. From time to time, ask them to evaluate how they’re doing in terms of applying those concepts to their work and identify any areas for improvement. This practice will encourage your employees to feel connected to your company values and make them accountable for doing their part in promoting them across the organization.

Keep the Momentum Going with a Strong Internal Communications Strategy

One of the biggest contributors to dissatisfied, disconnected employees is that they feel as if they’re working in silos. If you want to ensure your employees are happy and engaged, practicing good communication and keeping them in the know is crucial.

An easy way to do that is by implementing a monthly newsletter featuring company news and topics channeling organizational values that are of interest to your employees. If you use Act-On’s marketing automation, you can take your efforts a step further to track engagement and perform A/B testing to see what kind of content is resonating best with your employees — and then optimize your efforts accordingly.
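To make the A/B idea concrete, here is a generic sketch (not Act-On’s API; the numbers are made up) of how you might judge whether one newsletter variant’s open rate genuinely beats another’s, using a standard two-proportion z-test:

```python
from math import sqrt

def ab_z_score(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: how far apart are the two open rates?

    A |z| above roughly 1.96 suggests the difference is unlikely
    to be chance at the usual 95% confidence level.
    """
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled open rate under the null hypothesis of no difference.
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

# Variant A: 1,200 opens from 5,000 sends; variant B: 1,050 from 5,000.
z = ab_z_score(1200, 5000, 1050, 5000)
print(f"z = {z:.2f}")  # |z| > 1.96 here, so A's lift looks real
```

Marketing platforms typically run this kind of test for you; the point is simply that a difference in open rates only matters once it clears normal sampling noise.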

The practical tips above will help remind your employees what your company culture is all about and why they should be proud to be part of your organization, and they will improve overall engagement. Let us know if you plan to implement any of these practices at your organization or if you have any activities that have served you well in the past.


Act-On Blog


1/17/2019 Webinar: Updates for Developing Custom Visuals for Power BI by Ted Pattison

January 16, 2019   Self-Service BI

In this week’s webinar Ted Pattison will be covering how to develop Custom Visuals for Power BI.

Power BI has supported the development of custom visuals since its initial release. However, the custom visuals platform has been constantly evolving and has recently been updated to support dynamic module loading, which makes it possible to develop custom visuals for Power BI using the latest versions of D3.js, including D3 version 5. Join Ted and Chuck as they discuss the current state of the custom visuals platform and explain how to take advantage of the most recent enhancements.

When: 1/17/2019, 10 AM PST


Ted Pattison is an author, instructor, and co-founder and owner of Critical Path Training, a company dedicated to education on Power BI, Office 365, and SharePoint technologies. He is a 12-time recipient of Microsoft’s MVP award, and for the last three SharePoint releases Ted has worked with Microsoft’s Developer Platform Evangelism group researching and authoring training material for early adopters. Ted has taught hundreds of professionals how to get started building custom business solutions using Microsoft technologies.


Microsoft Power BI Blog | Microsoft Power BI


Samsung is reportedly developing a second Bixby smart speaker

December 28, 2018   Big Data

Google Home has the Google Home Mini. Amazon’s Echo has the Echo Dot. And when Samsung’s Bixby-driven smart speaker — the Galaxy Home — debuts sometime in 2019, it might have its own entry-level counterpart.

That’s according to SamMobile, which in a report today claimed that Samsung will release at least one other speaker next year in addition to the Galaxy Home. It will reportedly bear the model number “SM-V310” — not too far off from the Galaxy Home’s SM-V510 — and come in black.

Where features are concerned, Samsung’s second smart speaker will presumably be on par with its big brother.  And it’s almost a guarantee that it’ll run Bixby 2.0, the next-gen version of Bixby that Samsung previewed at Mobile World Congress 2018 in March.

Bixby 2.0 boasts better natural language processing and faster response times than its predecessor, along with built-in noise reduction tech. It’s also more conversational — if you ask it about upcoming concerts around New Year’s, for example, it’ll remember the date range when looking for tickets in the future. And it can make recommendations based on your previous searches.

At a press event in August, Samsung revealed that the Galaxy Home has functionality now considered standard issue for smart speakers, such as the ability to play music and control smart home devices without the need to lift a finger. It boasts AKG-tuned omnidirectional speakers and a subwoofer for deep bass, plus an array of eight microphones for far-field voice recognition.

Smart speaker demand

Market momentum for smart speakers shows no sign of slowing — quite the opposite, in fact. A report published by research firm Canalys indicates that worldwide smart speaker shipments grew 137 percent year over year in the third quarter of 2018 to reach 19.7 million units, up from 8.3 million in Q3 2017.

The number of voice-enabled speakers in use could come close to 100 million by the end of this year (up from 50 million at the end of 2017), some suggest, and early momentum more or less aligns with that prediction.

NPR and Edison Research estimated in July 2018 that 18 percent of American adults — around 43 million people — owned a smart speaker. (Consumer Intelligence Research Partners and Voicebot put the number at 50 million and 47.3 million, respectively.)

According to a survey commissioned by Adobe, almost half of all consumers in the United States will own a smart speaker by the end of the year. And a Global Market Insights report forecasts that the global smart speaker market could be worth as much as $30 billion by 2024.


Big Data – VentureBeat
