Business Intelligence Info

Monthly Archives: December 2020

Cedric The Entertainer-Produced Dramedy, Starring D.L. Hughley, Titled ‘Johnson’ Lands On Bounce TV

December 31, 2020   Humor

Created by Deji LaRay, the series comes from A Bird & A Bear Entertainment, along with LaRay and Thomas Q. Jones’ Midnight Train Productions.

Bounce TV has set the showrunners and lead cast for its dramedy series Johnson.

The show follows life-long best friends — all with the same last name of Johnson — as they navigate love, friendship, heartbreak and personal growth.

Led by D.L. Hughley, the cast also includes Thomas Q. Jones, series creator Deji LaRay, Philip Smithey, Derrex Brady, Khalilah Joi, Terri Abney and Rosa Acosta. Jones and LaRay serve as showrunners.

Johnson has begun filming in Atlanta, where the show also takes place. The dramedy is set to premiere in the summer of 2021.

Created by LaRay, Johnson is produced by Cedric the Entertainer and Eric Rhone’s A Bird & A Bear Entertainment, along with LaRay and Jones’ Midnight Train Productions. Jones, LaRay, Cedric the Entertainer, Rhone, and Reesha L. Archibald will executive produce. Knoko Chapple is producer.

“Black people are not monolithic and we as Black men can be very complex. We want to peel back those layers on the show and give women an inside look at why men do and say the things we do,” said LaRay. “While our lead characters are specifically speaking from the Black man’s point of view, they truly represent all men. Our goal is for men and women to have a better understanding of each other when they watch the show.”

Added Jones, “We created this series because we weren’t seeing anything like it on television. Insecure, Sex and the City, and other shows capture the beautiful essence of friendship among women, but it’s harder to distill that for men.”

Jones, a former NFL running back, has appeared in film and television shows such as Straight Outta Compton, P-Valley, Luke Cage, Shameless, Being Mary Jane, Runaway Island and A Violent Man, which he also executive produced.

Bounce, a broadcast and multi-platform entertainment network that targets African Americans, is owned by Scripps.

Source: Bounce



The Humor Mill


Teradata Appoints Todd Cione as Chief Revenue Officer

December 31, 2020   BI News and Info

Experienced Global Sales Leader with Proven Record Growing Revenue and Executing Cloud Transformations

Teradata (NYSE: TDC), the cloud data analytics platform company, today announced the appointment of Todd Cione as Chief Revenue Officer, effective January 4, 2021.

Cione brings to Teradata more than 25 years of experience in global sales, marketing, channel, and operations at large multinational technology organizations, including most recently at Apple, and previously with Oracle, Rackspace and Microsoft. He is a results-oriented executive with a proven track record of delivering predictable and profitable growth, and has successfully led organizations through cloud-based transformations. Cione will report to Steve McMillan, President and CEO of Teradata.

“I am pleased to welcome Todd to the Teradata team and look forward to working closely with him as we advance our go-to-market initiatives and deliver consistent, profitable revenue growth,” said McMillan. “We have been successfully executing on our cloud-first strategy, which has deeply resonated with our customers globally. We are confident that Todd’s experience and diverse skill set will be invaluable in helping customers unlock value from their data.”

“I am excited about joining Teradata and helping to accelerate its cloud-first momentum. I deeply share Teradata’s customer-centric purpose and look forward to delivering exceptional customer experiences with our cloud-based data analytics platform,” said Cione. “Working closely with Steve and the rest of the Company’s talented team, we will continue to help our customers harness the power of data and provide sustainable value for our shareholders. I look forward to hitting the ground running to drive sales strategies that strengthen our market position and continue to drive top-line growth.”

About Todd Cione
Cione most recently served as Head of U.S. Enterprise Accounts at Apple Inc., a role he held since 2017, where he was responsible for overseeing large enterprise customers and was a key contributor to driving customer success. Previously, he was SVP of Oracle’s Digital, North America Applications, and also served as Chief Revenue Officer of Rackspace. Additionally, Cione spent 15 years at Microsoft where he held roles of increasing responsibility in the U.S. and Asia, including General Manager, Asia Pacific Region Marketing & Operations, a market with over $4 billion in revenue, and General Manager and Managing Director, Asia Pacific Region Enterprise & Services Sales. During his tenure at Microsoft, Cione was an integral contributor to the Company’s transition to the cloud and helped launch Azure and Office 365 in U.S. and Asian markets.

Cione has a BA in English and Business from Baylor University with continuing executive education at INSEAD’s Asian International Executive Program and the University of Nebraska-Lincoln’s Gallup Leadership Development Program.


Teradata United States


The Grand Canyon

December 31, 2020   Humor

If you’ve ever been to the Grand Canyon, you’ll agree it’s an astounding place to visit. I first set eyes on the canyon at the age of 12. It’s the only time I’ve hiked in the canyon, taking the Bright Angel Trail from the South Rim to Indian Gardens and back. I don’t remember much of it, other than that it was a tough climb back up. In the summer of 2018, I got to fulfill a lifelong dream of traversing the entire canyon by raft. We started at Lee’s Ferry and took eight days on a motorized raft to Pearce Ferry at the upper end of Lake Mead, the entire 280 miles. So far, it’s my all-time favorite adventure and I would do it again in a heartbeat if I could.

It’s really enjoyable to learn as much about the place as I can. I found the following video series by accident. It’s a five-part history of “running” the Grand Canyon from Rim to Rim or even Rim to Rim to Rim. I’m not a runner and couldn’t care less about listening to stories of runners, but I thought I’d check it out just for the scenery. It turned out that this five-part series has more about the history of the Grand Canyon than I’ve read or seen in a long time. I found it quite interesting. If you’re stuck at home and have nothing better to do than spend almost three hours watching some great videos, you’ll learn some really interesting history and see some great pictures.



This last part is really all about the running. I came away from this episode thinking that if these assholes all came barrelling past me while I was trying to enjoy a leisurely hike through the canyon, I’d be calling them the assholes they are, a bunch of selfish attention seekers.


ANTZ-IN-PANTZ ……


Visualize 5 Cool Insights on Holiday Tree Trends Over Time

December 31, 2020   TIBCO Spotfire


Did you know Thomas Edison’s assistants proposed putting electric lights on Christmas trees? There’s a long and rich history surrounding holiday trees, in America and around the world. According to the History Channel, symbolic traditions involving evergreen trees in winter began in ancient Egypt and Rome and continue to take on new meaning today. 

New Holiday Traditions: Annual Analytics

Here at TIBCO, we’ve started our own holiday tradition involving the classic festive trees: using our data visualization software to understand trends in holiday tree data. Last year, we shared our analysis and “treemap” visualization (quite literally a treemap of trees) via TIBCO Spotfire®. This year, we dived even deeper into the data, using the new Spotfire Mods functionality to design custom apps for greater interactivity. Here’s what we found:

  • Top Tree Producing States: All 50 states contribute to the holiday tree industry, but our analysis shows the greatest production occurs in Oregon, North Carolina, Michigan, and Pennsylvania. Also interesting is that while Oregon and North Carolina are top producers overall, states like Ohio and Michigan definitely over-index for total tree producing counties as a percentage of their total land area. 
Immersive, interactive exploration of a bubble “tree-map” visualization Mod alongside county-level Spotfire geoanalytics [source: USDA census data]
  • Artificial vs. Real Tree Sales: As you can see below, artificial tree sales have been on the rise over the last decade, with 162 percent growth between 2004 and 2018. Artificial trees are taking over: 81 percent of the trees on display in storefronts, businesses, and homes in 2019 were artificial. But what does that mean for the global economy when China produces 80 percent of artificial trees worldwide, and given that artificial trees cannot be recycled like real trees?
Tree sales volume over time in this area chart visualization Mod in Spotfire [source: National Christmas Tree Association]
  • Rising Average Price of Real Trees: According to an article in The Hustle, “During the recession in 2008, ailing farmers planted too few trees. As a result, prices have been much higher since 2016.” The article also cites the National Christmas Tree Association as stating that the average retail price for a real tree in 2019 was $75. Obviously, this is a huge market, but one that continues to shift with economic and social changes—which makes us wonder just how different our analysis next year will look.
  • Consumer Demand Lower in 2019: In the area chart visualization above, we see that sales for natural trees still account for a larger share of the market. However, the artificial tree category set new high marks for sales in each year progressively from 2016 to 2018. Why could this be? One hypothesis might be that as Baby Boomers retire as “empty-nesters” and downsize their homes, they are buying fewer trees, but let us know your thoughts on this surprising find. 
  • The More the Merrier? Multiple Trees: According to a survey by the American Christmas Tree Association, the number of households in the United States that display more than one Christmas tree has grown by 10 percent from 2014 to 2019. In 2019, approximately 16 percent of American households display multiple trees. But will this trend continue or, as with the overall tree sales, will the number of trees per household decrease in the coming years?


A New Tradition: Immerse Yourself in Custom Analytics Applications

But this is just one festive story you could tell around data trends. What about shopping trends this year: will there be an increase in small business online sales? What will be the top gifted items in 2020?

You tell us! Join our tradition, and read our whitepaper to learn how the immersive qualities of Hyperconverged Analytics will create new value for your business. For a closer look at all of “What’s New in Spotfire®”, including visualization Mods, watch our 20-minute intro webinar on demand.


Shannon Peifer is a Marketing Content Specialist at TIBCO Software in Denver, CO. She graduated from the University of Texas at Austin in 2018 with a double major in marketing and English honors, and loves writing engaging content related to technology. Shannon grew up overseas, and loves to explore new places. When she’s not writing, you can find her swimming laps at the pool, gulping down iced lattes at local coffee shops, or scouring the shelves at the bookstore.


The TIBCO Blog


You don’t code? Do machine learning straight from Microsoft Excel

December 31, 2020   Big Data



Machine learning and deep learning have become an important part of many applications we use every day. There are few domains that the fast expansion of machine learning hasn’t touched. Many businesses have thrived by developing the right strategy to integrate machine learning algorithms into their operations and processes. Others have lost ground to competitors after ignoring the undeniable advances in artificial intelligence.

But mastering machine learning is a difficult process. You need to start with a solid knowledge of linear algebra and calculus, master a programming language such as Python, and become proficient with data science and machine learning libraries such as NumPy, scikit-learn, TensorFlow, and PyTorch.

And if you want to create machine learning systems that integrate and scale, you’ll have to learn cloud platforms such as Amazon AWS, Microsoft Azure, and Google Cloud.

Naturally, not everyone needs to become a machine learning engineer. But almost everyone who is running a business or organization that systematically collects and processes data can benefit from some knowledge of data science and machine learning. Fortunately, there are several courses that provide a high-level overview of machine learning and deep learning without going too deep into math and coding.

But in my experience, a good understanding of data science and machine learning requires some hands-on experience with algorithms. In this regard, a very valuable and often-overlooked tool is Microsoft Excel.

To most people, MS Excel is a spreadsheet application that stores data in tabular format and performs very basic mathematical operations. But in reality, Excel is a powerful computation tool that can solve complicated problems. Excel also has many features that allow you to create machine learning models directly in your workbooks.

While I’ve been using Excel’s mathematical tools for years, I didn’t come to appreciate its use for learning and applying data science and machine learning until I picked up Learn Data Mining Through Excel: A Step-by-Step Approach for Understanding Machine Learning Methods by Hong Zhou.

Learn Data Mining Through Excel takes you through the basics of machine learning step by step and shows how you can implement many algorithms using basic Excel functions and a few of the application’s advanced tools.

While Excel will in no way replace Python machine learning, it is a great window to learn the basics of AI and solve many basic problems without writing a line of code.

Linear regression machine learning with Excel

Linear regression is a simple machine learning algorithm that has many uses for analyzing data and predicting outcomes. Linear regression is especially useful when your data is neatly arranged in tabular format. Excel has several features that enable you to create regression models from tabular data in your spreadsheets.

One of the most intuitive is the data chart tool, which is a powerful data visualization feature. For instance, the scatter plot chart displays the values of your data on a Cartesian plane. But in addition to showing the distribution of your data, Excel’s chart tool can create a machine learning model that can predict the changes in the values of your data. The feature, called Trendline, creates a regression model from your data. You can set the trendline to one of several regression algorithms, including linear, polynomial, logarithmic, and exponential. You can also configure the chart to display the parameters of your machine learning model, which you can use to predict the outcome of new observations.
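To see the math a linear trendline is performing, here is a rough Python sketch (an illustration, not from the book or from Excel itself) of the same ordinary least-squares fit; the sample data is made up:

```python
# Ordinary least squares on (x, y) pairs -- the fit a linear Trendline performs.
def linear_trendline(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]
slope, intercept = linear_trendline(xs, ys)
# Use the fitted equation to predict a new observation, just as you would
# with the equation Trendline can display on the chart:
predicted = slope * 6 + intercept
```

Displaying the trendline equation on the chart gives you exactly these two parameters.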

You can add several trendlines to the same chart. This makes it easy to quickly test and compare the performance of different machine learning models on your data.


Above: Excel’s Trendline feature can create regression models from your data.

In addition to exploring the chart tool, Learn Data Mining Through Excel takes you through several other procedures that can help develop more advanced regression models. These include formulas such as LINEST and LINREG, which calculate the parameters of your machine learning models based on your training data.

The author also takes you through the step-by-step creation of linear regression models using Excel’s basic formulas such as SUM and SUMPRODUCT. This is a recurring theme in the book: You’ll see the mathematical formula of a machine learning model, learn the basic reasoning behind it, and create it step by step by combining values and formulas in several cells and cell arrays.
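As an illustration of that recurring theme (my sketch, not the book’s actual worksheets), the SUM/SUMPRODUCT construction of a linear model reduces to the classic closed-form least-squares sums; each Python expression below stands in for one cell formula:

```python
# The SUM/SUMPRODUCT construction maps onto the closed-form least-squares
# solution. Each expression below corresponds to one cell formula.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # y = 2x + 1, so we expect slope 2, intercept 1

n = len(xs)
sum_x = sum(xs)                               # =SUM(x_range)
sum_y = sum(ys)                               # =SUM(y_range)
sum_xy = sum(x * y for x, y in zip(xs, ys))   # =SUMPRODUCT(x_range, y_range)
sum_xx = sum(x * x for x in xs)               # =SUMPRODUCT(x_range, x_range)

slope = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
intercept = (sum_y - slope * sum_x) / n
```

Laying the same arithmetic out across cells is exactly what makes the spreadsheet version instructive: every intermediate sum is visible.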

While this might not be the most efficient way to do production-level data science work, it is certainly a very good way to learn the workings of machine learning algorithms.

Other machine learning algorithms with Excel

Beyond regression models, you can use Excel for other machine learning algorithms. Learn Data Mining Through Excel provides a rich roster of supervised and unsupervised machine learning algorithms, including k-means clustering, k-nearest neighbor, naive Bayes classification, and decision trees.

The process can get a bit convoluted at times, but if you stay on track, the logic will easily fall into place. For instance, in the k-means clustering chapter, you’ll get to use a vast array of Excel formulas and features (INDEX, IF, AVERAGEIF, ADDRESS, and many others) across several worksheets to calculate cluster centers and refine them. This is not a very efficient way to do clustering, but you’ll be able to track and study your clusters as they become refined in every consecutive sheet. From an educational standpoint, the experience is very different from programming books, where you feed your data points to a machine learning library function and it outputs the clusters and their properties.
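To make that assign-then-average cycle concrete, here is a minimal Python sketch (an illustration with invented one-dimensional data, not the book’s worksheets) of a single k-means refinement step:

```python
# One k-means refinement step on 1-D data, mirroring the assign-then-average
# cycle the book builds with INDEX/IF/AVERAGEIF formulas across worksheets.
def kmeans_step(points, centers):
    # Assign each point to its nearest center...
    clusters = [[] for _ in centers]
    for p in points:
        nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
        clusters[nearest].append(p)
    # ...then recompute each center as the mean of its cluster (AVERAGEIF).
    return [sum(c) / len(c) if c else centers[i]
            for i, c in enumerate(clusters)]

points = [1.0, 1.5, 2.0, 8.0, 9.0, 10.0]
centers = [2.0, 9.0]
centers = kmeans_step(points, centers)  # each call is one "worksheet" of refinement
```

In the spreadsheet version, each call to this function corresponds to one consecutive sheet, which is what lets you watch the clusters converge.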


Above: When doing k-means clustering on Excel, you can follow the refinement of your clusters on consecutive sheets.

In the decision tree chapter, you will go through the process of calculating entropy and selecting features for each branch of your machine learning model. Again, the process is slow and manual, but seeing under the hood of the machine learning algorithm is a rewarding experience.
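For reference, the entropy computed cell by cell in that chapter is ordinary Shannon entropy; a short Python equivalent (with invented labels) looks like this:

```python
import math
from collections import Counter

# Shannon entropy of a label column -- the quantity used when selecting
# decision-tree split features.
def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

entropy(["yes", "yes", "no", "no"])   # maximally mixed labels: 1.0 bit
entropy(["yes", "yes", "yes", "yes"]) # pure labels: 0.0 bits
```

Feature selection then picks the split that most reduces this value, weighted by branch size.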

In many of the book’s chapters, you’ll use the Solver tool to minimize your loss function. This is where you’ll see the limits of Excel, because even a simple model with a dozen parameters can slow your computer down to a crawl, especially if your data sample is several hundred rows in size. But the Solver is an especially powerful tool when you want to fine-tune the parameters of your machine learning model.
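What Solver does under the hood is numerical loss minimization. As a loose analogue (my illustration; Solver actually uses methods such as GRG Nonlinear rather than plain gradient descent), here is gradient descent on a one-parameter squared-error loss with made-up data:

```python
# A bare-bones analogue of Solver minimizing a loss cell by adjusting a
# parameter cell: gradient descent on mean squared error for y = w * x.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # generated by w = 2, so the fit should recover w = 2

w = 0.0
for _ in range(200):
    # dLoss/dw for loss = mean((w*x - y)^2)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= 0.05 * grad  # the "adjust the parameter cell" step
```

With a dozen parameters instead of one, each iteration touches far more cells, which is why Solver-driven workbooks slow down so noticeably.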


Above: Excel’s Solver tool fine-tunes the parameters of your model and minimizes loss functions.

Deep learning and natural language processing with Excel

Learn Data Mining Through Excel shows that Excel can even express advanced machine learning algorithms. There’s a chapter that delves into the meticulous creation of deep learning models. First, you’ll create a single-layer artificial neural network with fewer than a dozen parameters. Then you’ll expand on the concept to create a deep learning model with hidden layers. The computation is very slow and inefficient, but it works, and the components are the same: cell values, formulas, and the powerful Solver tool.


Above: Deep learning with Microsoft Excel gives you a view under the hood of how deep neural networks operate.

In the last chapter, you’ll create a rudimentary natural language processing (NLP) application, using Excel to create a sentiment analysis machine learning model. You’ll use formulas to create a “bag of words” model, preprocess and tokenize hotel reviews, and classify them based on the density of positive and negative keywords. In the process you’ll learn quite a bit about how contemporary AI deals with language and how different it is from the way we humans process written and spoken language.
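To give a flavor of that exercise, here is a toy bag-of-words sentiment scorer in Python; the keyword lists and reviews are invented, not the book’s data:

```python
# Toy keyword-density sentiment scoring: count positive vs. negative words.
POSITIVE = {"clean", "friendly", "great", "comfortable"}
NEGATIVE = {"dirty", "rude", "noisy", "broken"}

def sentiment(review):
    tokens = review.lower().replace(".", "").split()  # crude tokenization
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

sentiment("Great location and friendly staff.")  # -> "positive"
sentiment("The room was dirty and noisy.")       # -> "negative"
```

In the workbook version, COUNTIF-style formulas play the role of the membership tests here.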

Excel as a machine learning tool

Whether you’re making C-level decisions at your company, working in human resources, or managing supply chains and manufacturing facilities, a basic knowledge of machine learning will be important if you will be working with data scientists and AI people. Likewise, if you’re a reporter covering AI news or a PR agency working on behalf of a company that uses machine learning, writing about the technology without knowing how it works is a bad idea (I will write a separate post about the many awful AI pitches I receive every day). In my opinion, Learn Data Mining Through Excel is a smooth and quick read that will help you gain that important knowledge.

Beyond learning the basics, Excel can be a powerful addition to your repertoire of machine learning tools. While it’s not good for dealing with big data sets and complicated algorithms, it can help with the visualization and analysis of smaller batches of data. The results you obtain from a quick Excel mining can provide pertinent insights in choosing the right direction and machine learning algorithm to tackle the problem at hand.

Ben Dickson is a software engineer and the founder of TechTalks. He writes about technology, business, and politics.

This story originally appeared on Bdtechtalks.com. Copyright 2020



Big Data – VentureBeat


Set Targets and Track Dynamics 365 CRM User Performance with Ease on Daily, Weekly or Monthly Basis

December 30, 2020   CRM News and Info


Looking for a complete Dynamics 365 CRM monitoring solution? Get User Adoption Monitor for seamless tracking of Dynamics 365 CRM user actions!

Recently, our user action monitoring app – User Adoption Monitor, a Preferred App on Microsoft AppSource – released three new features, making it a formidable app for tracking user actions across Dynamics 365 CRM/Power Apps. In our previous blog, we gave you in-depth information about one of the newly released features – Data Completeness – which helps you ensure that all Dynamics 365 CRM records have the necessary information required to conduct smooth business transactions. Now, in this blog, we will shed more light on the remaining two new features – Aggregate Tracking and Target Tracking.

So, let’s see how these two new features will help you in tracking Dynamics 365 CRM/Power Apps user actions.

Aggregate Tracking

With this feature, you will be able to track aggregations of numeric fields on the entity for which a specific user action has been defined. For example, as a Sales Manager you may want to track how many sales your team has made in a specific period. To find out, all you have to do is configure aggregate tracking for the ‘Opportunity-win’ action. Once the tracking is done, you will get the SUM of the Actual Revenue of all the Opportunities won by each of your team members for a defined period of time. Similarly, you can get the aggregate value (SUM or AVG) of Budget Amount, Est. Revenue, Freight Amount, etc.
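Conceptually, the aggregation described above is a group-by-and-sum over won Opportunities. This hypothetical Python sketch illustrates the idea; the field names and data are invented for illustration, not the app’s actual schema:

```python
from collections import defaultdict

# Sum the Actual Revenue of won opportunities per team member -- the same
# per-user aggregation the tracking feature reports.
opportunities = [
    {"owner": "Alice", "actual_revenue": 12000.0},
    {"owner": "Bob",   "actual_revenue": 8000.0},
    {"owner": "Alice", "actual_revenue": 5000.0},
]

totals = defaultdict(float)
for opp in opportunities:
    totals[opp["owner"]] += opp["actual_revenue"]
# totals: {"Alice": 17000.0, "Bob": 8000.0}
```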


Target Tracking

Using this feature, you will be able to allot sales targets to your team members and keep regular track of them in Dynamics 365 CRM/Power Apps. You can set targets as both a count and a value. Consider a scenario where you want to appraise the performance of your sales team. With this feature, you can set a target for each of your team members and, based on the tracking results, easily analyze their performance. You can track the performance of your team members using this feature in the following ways:

Target based on Aggregation Value

If you want to track the total sales value generated by your team members then you can use this feature and set the target in aggregate value. Once set, you can now monitor the performance of your team members by comparing the aggregate value of the target set and the total aggregate value of the target achieved by them on a daily, weekly, or monthly basis.


Target based on Count

Similarly, if you want to keep tabs on the count of sales made by your team members, you can set the target as a count. Once set, you can monitor and compare the target set (in count) and the total target achieved (in count) by your team members on a daily, weekly, or monthly basis.


Target defined for Fixed Interval

In the above two scenarios, the Targets were defined with the Interval set as ‘Recurring’. In such cases, tracking will be done on a daily, weekly, or monthly basis. Other than that, you can also define the Target for a fixed period of time by setting the Interval as Fixed.

After you set the interval of Target Configuration as fixed, you can define the Start Date and End Date for which you want to set the target tracking.

With this, you can easily monitor and compare the target set and the total target achieved (both in count or aggregate value) by your team members for a given fixed period of time.


A handy app to have at appraisal time, isn’t it?

So, wait no more! Satisfy your curiosity by downloading User Adoption Monitor from our website or Microsoft AppSource and exploring these amazing features with a 15-day free trial.

For a personal demo or any user actions monitoring queries, feel free to contact us at crm@inogic.com

Until next time – wishing you a safe and glorious New Year!

Keep Tracking Us!


CRM Software Blog | Dynamics 365


Crossing the Line

December 30, 2020   Humor

A couple of days ago, I posted a video of the restaurant owner blocking the health inspector’s car so he couldn’t leave and thus continue to do his job. The point being made was, “If I can’t do my job, you can’t do yours.”

Here is an excellent read on this incident and the subject at hand. The author makes the point that there are FAR more of us than there are of them. It’s worth the read.


ANTZ-IN-PANTZ ……


Sisense and Signals Analytics Bring the Power of External Data to the Enterprise

December 30, 2020   Sisense

We’re stronger when we work together. In our Partner Showcase, we highlight the amazing integrations, joint projects, and new functionalities created and sustained by working with our technology partners at companies like AWS, Google, and others. 

Business teams constantly want to know how their companies are performing — against their internal goals and those of the market they compete in. They benchmark their performance against their previous results, what their customers are asking for, what their customers are buying, and ideally what their customers will buy. To get their answers, businesses typically rely on data sources that are all internal, showing decision-makers only part of the picture.

That’s now in the past. Today, through a strategic partnership, Signals Analytics and Sisense are making it easy to incorporate external data analytics into a company’s main BI environment. The result is a broader, more holistic view of the market coupled with more actionable and granular insights. Infusing these analytics everywhere democratizes data usage and access to critical insights across the enterprise.

Organizations who are truly data-driven know how to leverage a wide range of internal and external data sources in their decision-making. The integration of Signals Analytics in the Sisense business intelligence environment gets them there faster and seamlessly, without the need for specialized resources to build complex systems.

Kobi Gershoni, Signals Analytics co-founder and chief research officer

Why external data analytics?

The integration of Signals Analytics with the Sisense platform delivers on the promise of advanced analytics — infusing intelligence at the right place and the right time, upleveling standard decisions to strategic decisions, and speeding the time to deployment. Combining internal and external data unlocks powerful insights that can drive innovation, product development, marketing, partnerships, acquisitions, and more. 

Primary use cases for external data analytics

External data is uniquely well-suited to inform decision points across the product life cycle, from identifying unmet needs to predicting sales for specific attributes, positioning against the competition, measuring outcomes, and more. By incorporating a wide range of external data sources that are connected and contextualized, users benefit from a more holistic picture of the market.

For example, when combining product reviews, product listings, social media, blogs, forums, news sites, and more with sales data, the accuracy rate for predictive analytics jumps from 36% to over 70%. Similar results are seen when going from social listening alone to using a fully connected and contextualized external data set to generate predictions.  

The Sisense and Signals Analytics partnership: What you need to know

  • Signals Analytics provides the connected and contextualized datasets for specific fast-moving consumer goods (FMCG) categories
  • Sisense users can tap into one of the broadest external datasets available and unleash the power of this connected data in their Sisense dashboards
  • The ROI of the analytics investment dramatically increases when combining historical data, sales, inventory, and customer data with Signals Analytics data

Integrate external data analytics in your Sisense environment in three easy steps

Step 1: Connect

From your Sisense UI, use the Snowflake data connector to connect to the Signals Analytics Data Mart. The data can be queried live in the Sisense ElastiCube.

Step 2: Select

Once the data connection has been established, select the data types needed by filtering the relevant “Catalog.”

Step 3: Visualize

Select the dimensions, measures, and filters to apply, then visualize.
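Conceptually, this last step is a filtered group-by over the selected catalog data: pick a dimension to group on, a measure to aggregate, and a filter to apply. A minimal sketch of that operation (the rows and field names below are hypothetical for illustration, not an actual Signals Analytics schema):

```python
from collections import defaultdict

def aggregate(rows, dimension, measure, predicate=lambda r: True):
    """Group filtered rows by a dimension and sum a measure,
    mirroring the dimension/measure/filter selection in a dashboard widget."""
    totals = defaultdict(float)
    for row in rows:
        if predicate(row):
            totals[row[dimension]] += row[measure]
    return dict(totals)

# Hypothetical rows resembling an FMCG catalog extract.
rows = [
    {"category": "snacks", "mentions": 120, "source": "reviews"},
    {"category": "snacks", "mentions": 80, "source": "social"},
    {"category": "beverages", "mentions": 50, "source": "reviews"},
]
by_category = aggregate(rows, "category", "mentions",
                        predicate=lambda r: r["source"] == "reviews")
# by_category == {"snacks": 120.0, "beverages": 50.0}
```

In practice the Sisense widget builder performs this aggregation for you; the sketch only shows what the dimension/measure/filter choices mean.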

More data sources, better decisions

Your company is sitting on a large supply of data, but unless and until you find the right datasets to complement it, the questions you can answer and the insights you can harness from it will be limited. Whatever your company does and whatever questions you are trying to answer, mashing up data from a variety of sources, inside the right platform, is vital to surfacing game-changing insights.

To get started on the next leg of your analytics journey start a free trial or become a partner.

Tags: advanced analytics | analytics implementation


Blog – Sisense


9 trends in enterprise database technology

December 30, 2020   Big Data



The database has always revolved around rock-solid reliability. Data goes in and then comes out in exactly the same way. Occasionally, the bits will be cleaned up and normalized so all of the dates are in the same format and the text is in the same character set, but other than that, nothing should be different.

That consistency is what makes the database essential for any enterprise — allowing it to conduct things like ecommerce transactions. It’s also why the database remains distinct from the data warehouse, another technology that is expanding its mission to cover slower-twitch work like analysis. The database acts as the undeniable record of the enterprise, the single source of truth.

Now databases are changing. Their focus is shifting and they’re accepting more responsibilities and offering smarter answers. In short, they’re expanding and taking over more and more of the stack.

Many of us might not notice because we’ve been running the same database for years without a change. Why mess with something that works? But as new options and features come along, it makes sense to rethink the architectures of data flows and take advantage of all the new options. Yes, the data will still be returned exactly as expected, but it will be kept safer and presented in a way that’s easier to use.

Many drivers of the change are startups built around a revolutionary new product, like multi-cloud scaling or blockchain assurance. For each new approach to storing information, there are usually several well-funded startups competing to dominate the space and often several others still in stealth mode.

The major companies are often not far behind. While it can take more time to add features to existing products, the big companies are finding ways to expand, sometimes by revising old offerings or by creating new ones in their own skunkworks. Amazon, for instance, is the master at rolling out new ways to store data. Its cloud has at least 11 different products called databases, and that doesn’t include the flat file options.

The other major cloud providers aren’t far behind. Microsoft has migrated its steadfast SQL Server to Azure and found ways to offer a half-dozen open source competitors, like MySQL. Google delivers both managed versions of relational databases and large distributed and replicated versions of NoSQL key/value pairs.

The old standards are also adding new features that often deliver much of the same promise as the startups while continuing support of older versions. Oracle, for instance, has been offering cloud versions of its database while adding new query formats (JSON) and better performance to handle the endless flood of incoming data.

IBM is also moving Db2 to the cloud while adding new features like integration with artificial intelligence algorithms that analyze the data. It’s also supporting the major open source relational databases while building out a hybrid version that merges Oracle compatibility with the PostgreSQL engine.

Among the myriad changes to old database standards and new emerging players, here (in no particular order) are nine key ways databases are being reborn.

1. Better query language

SQL may continue to do the heavy lifting around the world. But newer options for querying — like GraphQL — are making it easier for front-end developers to find the data they need to present to the user and receive it in a format that can be dropped right into the user interface.

GraphQL follows the standard JavaScript format for serializing objects, making it easier for middle- and front-end code to parse it. It also hides some of the complexity of JOINs, making it simpler for end users to grab just the data they need. Developers are already adding tools like Apollo Studio, an IDE for exploring queries, or Hasura, an open source front-end that wraps GraphQL around legacy databases like PostgreSQL.
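To make the request/response shape concrete, here is a small example; the schema and field names are invented for illustration. The client names exactly the fields it wants, the JOIN-like traversal (user to orders) is hidden behind the schema, and the reply is plain JSON that front-end code can consume directly:

```python
import json

# A hypothetical GraphQL query against an invented schema.
query = """
{
  user(id: "42") {
    name
    orders { total }
  }
}
"""

# The server replies in GraphQL's standard JSON envelope, so the
# front end can drop the payload straight into the UI.
raw_response = '{"data": {"user": {"name": "Ada", "orders": [{"total": 19.5}]}}}'
user = json.loads(raw_response)["data"]["user"]
# user["name"] == "Ada"
```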

2. Streaming databases follow vast flows

The model for a standard database is a big ledger, much like the ones clerks would maintain in fat bound books. Streaming databases like ksqlDB are built to watch an endless stream of data events and answer questions about them. Instead of imagining that the data is a permanent table, the streaming database embraces the endlessly changing possibilities as data flows through them.
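The idea can be sketched with a Python generator standing in for a streaming query — this is just the concept, not the ksqlDB API. Each arriving event updates a continuously maintained view instead of waiting for a batch query over a static table:

```python
from collections import Counter

def running_counts(events):
    """Toy stand-in for a streaming query: maintain a continuously
    updated count per key as events arrive."""
    counts = Counter()
    for event in events:
        counts[event["key"]] += 1
        yield dict(counts)  # each event produces an updated "view"

stream = [{"key": "login"}, {"key": "click"}, {"key": "login"}]
views = list(running_counts(stream))
# views[-1] == {"login": 2, "click": 1}
```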

3. Time-series database

Most database columns have special formats for tracking date stamps. Time-series databases like InfluxDB or Prometheus do more than just store the time. They track and index the data for fast queries, like how many times a user logged in between January 15 and March 12. These are often special cases of streaming databases where the data in the streams is being tracked and indexed for changes over time.
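The core trick is keeping timestamps indexed so range queries stay cheap. A toy sketch, with a sorted list plus binary search standing in for a real time-series index:

```python
from bisect import bisect_left, bisect_right
from datetime import datetime

# Timestamps kept sorted so range queries ("how many logins between
# January 15 and March 12?") reduce to two binary searches.
logins = sorted([
    datetime(2020, 1, 10),
    datetime(2020, 1, 20),
    datetime(2020, 2, 2),
    datetime(2020, 3, 30),
])

def count_between(index, start, end):
    """Count events with start <= timestamp <= end."""
    return bisect_right(index, end) - bisect_left(index, start)

n = count_between(logins, datetime(2020, 1, 15), datetime(2020, 3, 12))
# n == 2  (January 20 and February 2)
```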

4. Homomorphic encryption

Cryptographers were once happy to lock up data in a safe. Now some are developing a technique called homomorphic encryption to make decisions and answer queries on encrypted data without actually decrypting it, a feature that vastly simplifies cloud security and data sharing. This allows computers and data analysts to work with data without knowing what’s in it. The methods are far from comprehensive, but companies like IBM are already delivering toolkits that can answer some useful database queries.
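The additive case can be shown with the classic Paillier scheme, here with deliberately tiny primes for illustration only (nowhere near secure parameters). Multiplying two ciphertexts yields a ciphertext of the *sum* of the plaintexts, so a server can total values it cannot read:

```python
from math import gcd, lcm

# Toy Paillier cryptosystem (tiny primes, illustration only).
p, q = 61, 53
n = p * q
n2 = n * n
g = n + 1
lam = lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse, Python 3.8+

def encrypt(m, r):
    assert 0 <= m < n and gcd(r, n) == 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(20, 17), encrypt(22, 29)
total = decrypt((c1 * c2) % n2)  # multiplying ciphertexts adds plaintexts
# total == 42
```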

5. In-memory database

The original goal of a database was to organize data so it could be available in the future, even when electricity is removed. The trouble is that sometimes even storing the data to persistent disks takes too much time, and it may not be worth the effort. Some applications can survive the occasional loss of data (would the world end if some social media snark disappeared?), and fast performance is more important than disaster recovery. So in-memory databases like Amazon’s ElastiCache are designed for applications that are willing to trade permanence for lightning-fast response times.
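A minimal sketch of the idea — a dictionary with lazy TTL eviction. Real systems like Redis or ElastiCache add eviction policies, optional persistence, and clustering on top of the same basic contract:

```python
import time

class InMemoryCache:
    """Values live only in RAM and may expire, trading durability for speed."""
    def __init__(self):
        self._data = {}

    def set(self, key, value, ttl=None):
        expires = time.monotonic() + ttl if ttl is not None else None
        self._data[key] = (value, expires)

    def get(self, key, default=None):
        item = self._data.get(key)
        if item is None:
            return default
        value, expires = item
        if expires is not None and time.monotonic() >= expires:
            del self._data[key]  # lazily evict expired entries on read
            return default
        return value

cache = InMemoryCache()
cache.set("session:42", {"user": "ada"}, ttl=30)
# cache.get("session:42") == {"user": "ada"}
```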

6. Microservice engines

Developers have traditionally built their code as a separate layer that lives outside the database itself, and this code treats the database as a black box. But some are noticing that the databases are so feature-rich they can act as microservice engines on their own. PostgreSQL, for instance, now allows embedded procedures to commit full transactions and initiate new ones before spitting out answers in JSON. Developers are recognizing that the embedded code that has been part of databases like Oracle for years may be just enough to build many of the microservices imagined by today’s architects.
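A small stand-in for that pattern, using Python’s bundled SQLite (and assuming its JSON1 functions, which most modern builds include) in place of PostgreSQL: the query itself emits the JSON payload a microservice would return, with no serialization layer in application code.

```python
import json
import sqlite3

# In-memory database standing in for a feature-rich server database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO users VALUES (1, 'Ada')")

# The SELECT produces service-ready JSON directly inside the database.
(payload,) = db.execute(
    "SELECT json_object('id', id, 'name', name) FROM users WHERE id = 1"
).fetchone()
# json.loads(payload) == {"id": 1, "name": "Ada"}
```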

Jupyter notebooks started out as a way for data scientists to bundle their answers with the Python code that produced it. Then data scientists started integrating the data access with the notebooks, which meant going where the information was stored: the database. Today, SQL is easy to integrate, and users are becoming comfortable using the notebooks to access the database and generate smart reports that integrate with data science (Julia or R) and machine learning tools. The newer Jupyter Lab interface is turning the classic notebook into a full-service IDE, complete with extensions that pull data directly from SQL databases.

7. Graph databases

The network of connections between people or things is one of the dominant data types on the internet, so it’s no surprise that databases are evolving to make it easier to store and analyze these relationships.

Neo4j now offers a visualization tool (Bloom) and a collection of data science functions for developing complex reports about the network. GraphDB is focusing on developing “semantic graphs” that use natural language to capture linguistic structures for big analytic projects. TerminusDB is aimed at creating knowledge graphs with a versioning system much like Git. All of them bring efficiency to storing a complex set of relationships that don’t fit neatly into standard tables.
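The kind of question these systems optimize for is path-finding over relationships. A toy adjacency-list version, with invented data and a plain breadth-first search standing in for a real graph engine:

```python
from collections import deque

# Hypothetical "knows" relationships, stored as an adjacency list.
knows = {
    "ada": ["grace", "alan"],
    "grace": ["ada", "edsger"],
    "alan": ["ada"],
    "edsger": ["grace", "donald"],
    "donald": ["edsger"],
}

def shortest_path(graph, start, goal):
    """Breadth-first search: the shortest chain of relationships."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbor in graph.get(path[-1], []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None

# shortest_path(knows, "ada", "donald") == ["ada", "grace", "edsger", "donald"]
```

Graph databases make exactly these traversals fast at scale, where the equivalent relational query would need repeated self-JOINs.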

8. Merging data storage with transport

Databases were once hidden repositories to keep data safe in the back office. Delivering this information to the user was the job of other code. Now, databases like Firebase treat the user’s phone or laptop as just another location for replicating data.

Databases like FaunaDB are baking replication into the stack, thus saving the DBA from moving the bits. Now, developers don’t need to think about getting information to the user. They can just read and write from the local data store and assume the database will handle the grubby details of marshaling the bytes across the network while keeping them consistent.
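That developer-facing contract can be sketched in a few lines — a naive synchronous fan-out, where real systems additionally handle conflicts, offline queues, and consistency levels:

```python
class ReplicatedStore:
    """Sketch of the idea: the app writes to its local store and the
    database layer fans the change out to replicas for the developer."""
    def __init__(self, replicas):
        self.local = {}
        self.replicas = replicas  # e.g. other regions, or a client device

    def write(self, key, value):
        self.local[key] = value
        for replica in self.replicas:  # replication happens behind the API
            replica[key] = value

phone, laptop = {}, {}
store = ReplicatedStore([phone, laptop])
store.write("draft", "hello")
# phone["draft"] == laptop["draft"] == "hello"
```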

9. Data everywhere

A few years ago, all the major browsers began supporting the Local Storage and Indexed Storage APIs, making it easier for web applications to store significant amounts of data on the client’s machine. The early implementations limited the data to 5MB, but some have bumped the limits to 10MB. The response time is much faster, and it will also work even when the internet connection is down. The database is not just running on one box in your datacenter, but in every client machine running your code.
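A Python sketch of that client-side storage contract — string keys and values with a hard quota, shrunk to a few bytes for the demo (browsers enforce megabytes and surface the failure as a `QuotaExceededError`):

```python
class QuotaExceededError(Exception):
    pass

class LocalStorage:
    """Toy model of the browser's Local Storage: string keys/values,
    writes rejected once the size quota would be exceeded."""
    def __init__(self, quota_bytes):
        self.quota = quota_bytes
        self._items = {}

    def used(self):
        return sum(len(k) + len(v) for k, v in self._items.items())

    def set_item(self, key, value):
        new_size = self.used() - len(self._items.get(key, "")) + len(value)
        if key not in self._items:
            new_size += len(key)
        if new_size > self.quota:
            raise QuotaExceededError(key)
        self._items[key] = value

    def get_item(self, key):
        return self._items.get(key)

storage = LocalStorage(quota_bytes=16)
storage.set_item("theme", "dark")  # 5 + 4 = 9 bytes, within quota
# storage.get_item("theme") == "dark"
```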



Big Data – VentureBeat


The OWN Channel To Air John Legend Hosted Special Titled ‘Revisiting Underground’

December 29, 2020   Humor

Los Angeles – OWN: Oprah Winfrey Network will air “Revisiting Underground,” an all-new half-hour special from Sony Pictures Television, immediately following the previously announced January 5 premiere of the critically acclaimed historical drama “Underground on OWN” at 9 p.m. ET/PT. The dramatic series acquired from Sony Pictures Television will have a revitalized presentation on OWN, with newly filmed episodic introductions by cast members, never-before-seen behind-the-scenes footage and more. The premiere episode has been expanded to 90 minutes, with “Revisiting Underground” debuting at 10:30 p.m. ET/PT.

Grounded in this moment of cultural shift, “Revisiting Underground” reminds us of the importance of taking history off of the shelves and bringing it into our daily lives. Executive Producers John Legend, Misha Green and Joe Pokaski, along with the cast and crew, share personal stories, reflect on their time working on the show, and discuss the power of activism and the relevance of the show today.

We’ll look at the research everyone did to prepare for their roles, hear their personal stories of challenges and triumphs from their time working on these intense stories, and explore the music of the show. Finally, we’ll share how the cast honors the heroes of the Underground with the activism in their daily lives.

“Underground,” which aired for two seasons (20 episodes) on WGN America, was co-created by Misha Green (“Lovecraft Country,” “Sons of Anarchy,” “Heroes”) and Joe Pokaski (“Cloak & Dagger,” “Heroes,” “Daredevil”).  They serve as executive producers alongside Emmy®-nominated director Anthony Hemingway (“Power,” “Red Tails,” “Treme”); Academy Award-winning writer Akiva Goldsman (“A Beautiful Mind,” “I Am Legend”) of Weed Road Pictures; Tory Tunnell (“Spinning Out,”“King Arthur”) and Joby Harold (“King Arthur,” “Edge of Tomorrow”) of Safehouse Pictures; and EGOT winner John Legend, Emmy® and Tony® Award winner Mike Jackson and Emmy® winner Ty Stiklorius of Get Lifted Film Co (“Rhythm + Flow”, “Jingle Jangle: A Christmas Journey”); and Mark Taylor (“MadTV”). Additionally, Legend and Get Lifted oversaw the score, soundtrack and all music aspects of the series.

In 1857, a restless slave named Noah (Aldis Hodge) organizes a small team of fellow slaves on the Macon plantation outside Atlanta, and puts together a plan to run for their lives — 600 dangerous miles North — to freedom. The odds of success are slim; the terrain on the path to freedom is unforgiving, and Tom (Reed Diamond), their politically ambitious owner, will surely kill anyone attempting to run. For those who make it off the plantation, the risks and uncertainties multiply. They leave family behind to pay for their sins, as they face danger and death at every turn. They’re aided along the way by an abolitionist couple in Ohio, new to running a station on the Underground and unprepared for the havoc it will wreak with their personal lives, while they evade a ruthless slave catcher hell-bent on bringing them back, dead or alive.

“Underground” stars Aldis Hodge (“City on a Hill,” “Brian Banks,” “Hidden Figures”) as Noah, a restless slave who organizes a small team of fellow slaves on the Macon plantation to plan an escape; Jurnee Smollett (“Lovecraft Country,” “Birds of Prey: And the Fantabulous Emancipation of One Harley Quinn,” “True Blood”) as Rosalee, a shy house slave with a powerful inner strength and courage; Emmy-nominated actor Christopher Meloni (“Law & Order: Special Victims Unit,” “Happy!”) as August Pullman, a secretive man who walks a tightrope between morality and survival; Alano Miller (“Cherish The Day,” “Jane The Virgin,” “Loving”) as Cato, a cunning, charismatic man despised and feared by his fellow slaves; and Jessica De Gouw (“Pennyworth,” “The Crown,” “The Last Tycoon”) as Elizabeth Hawkes, a socialite who shares the abolitionist ideals of her husband, John (Marc Blucas, “Knight and Day,” “Buffy the Vampire Slayer”), a lawyer whose principles clash with the legislation he’s sworn to uphold. The internationally renowned cast includes Adina Porter (“American Horror Story,” “True Blood,” “The 100”) as Pearly Mae, a strong-willed wife and mother; Mykelti Williamson (“Fences,” “Forrest Gump”) as her husband Moses, a fiery preacher; Amirah Vann (“How to Get Away with Murder,” “Star Trek: Picard,” “Unsolved: The Murders of Tupac and the Notorious B.I.G”) as Ernestine, head house slave and fiercely devoted mother; Johnny Ray Gill (“Rectify”) as Sam, Rosalee’s half-brother and a talented carpenter; Chris Chalk (“When They See Us,” “Perry Mason,” “Homeland”) as William Still, an abolitionist ally; Reed Diamond (“13 Reasons Why”) as Tom Macon, a plantation owner and political candidate; and Jussie Smollett (“Empire”), who joins his sister Jurnee in the cast as Josey, a wild-eyed runaway who doesn’t trust anyone.

In season 2, Legend guest stars as iconic abolitionist, orator and author Frederick Douglass; Aisha Hinds (“9-1-1,” “Unsolved: The Murders of Tupac and the Notorious B.I.G,” “True Blood”) steps into a recurring guest star role as Harriet Tubman; and Sadie Stratton (“Westworld”) portrays notorious slave trader Patty Cannon.



The Humor Mill
