Tag Archives: points

Recognizing data points that signal trends for the future of business post-pandemic

October 18, 2020   Big Data


Planning for a post-COVID-19 future and creating a robust enterprise strategy require both strategic scenario planning and the ability to recognize what scenario planners call “news from the future” — data points that tell you whether the world is trending in the direction of one or another of your imagined scenarios. As with any scatter plot, data points are all over the map, but when you gather enough of them, you can start to see the trend line emerge.

Because there are often many factors pushing or pulling in different directions, it’s useful to think of trends as vectors — quantities that are described by both a magnitude and a direction, which may cancel, amplify, or redirect each other. New data points can also show whether vectors are accelerating or decelerating. As you see how trend vectors affect each other, or that new ones need to be added, you can continually update your scenarios.

Sometimes a trend itself is obvious. Twitter, Facebook, Google, and Microsoft each announced a commitment to new work-from-home policies even after the pandemic. But how widespread will this be? To see if other companies are following in their footsteps, look for job listings from companies in your industry that target new metro areas or ignore location entirely. Drops in the price or occupancy rate of commercial real estate, and how that spills over into residential real estate, might add or subtract from the vector.

Think through possible follow-on effects to whatever trend you’re watching. What are the second-order consequences of a broader embrace of the work-from-home experience? Your scenarios might include the possible emptying out of dense cities that are dependent on public transportation and movement from megacities to suburbs or to smaller cities. Depending on who your workers and your customers are, these changes could have an enormous impact on your business.

What are some vectors you might want to watch? And what are examples of news from the future along those trend lines?

The progress of the pandemic itself. Are cases and deaths increasing or declining? If you’re in the U.S., Covid Act Now is a great site for tracking the pandemic. The data there suggests that pandemic response won’t be a “one and done” strategy, but more like what Tomas Pueyo described in his essay “The Hammer and the Dance,” in which countries drop the hammer to reduce cases, reopen their economies, see recurrences, and drop the hammer again, with the response becoming increasingly fine-grained and local as better data becomes available. As states and countries reopen, a lot of new data will shape our estimates of the future, albeit with new uncertainty about a possible resurgence even if the results are positive.

Is there progress toward treatment or a vaccine? Several vaccine candidates are in trials, and new treatments seem to improve the prognosis for the disease. A vector pushing in the other direction is the discovery of previously missed symptoms or transmission factors. Another is the politicization of public health, which began with masks but may also extend to vaccine denial. We may be living with uncertainty for a long time to come; any strategy involving a “return to normal” needs to be held very loosely.

How do people respond if and when the pandemic abates? Whatever comes back is likely to be irretrievably changed. As Ben Evans said, sometimes the writing is on the wall, but we don’t read it. It was the end of the road for BlackBerry the moment the iPhone was introduced; it just took four years for the story to play out. Sometimes a seemingly unrelated shock accelerates a long overdue collapse. For example, ecommerce has been growing its share for years, but this may be the moment when the balance tips and much in-person retail never comes back. As Evans put it, a bunch of industries look like candidates to endure a decade of inevitability in a week’s time.

Will people continue to walk and ride bikes, bake bread at home, and grow their own vegetables? (This may vary from country to country. People in Europe still treasure their garden allotments 70 years after the end of World War II, but U.S. victory gardens were a passing thing.) Will businesses have the confidence to hire again? Will consumers have the confidence to spend again? What percentage of businesses that shut down will reopen? Are people being rehired and unemployment rates going down? The so-called Y-shaped recovery, in which upper-income jobs have recovered while lower-income jobs are still stagnant, has been so unprecedented that it hasn’t yet made Wikipedia’s list of recession shapes.

Are there meaningful policy innovations that are catching on? Researchers in Israel have proposed a model for business reopening in which people work four-day shifts followed by ten days off in lockdown. Their calculations suggest that this would lower transmissibility of the virus almost as well as full lockdown policies, but allow people in many more occupations to get back to work, and many more businesses to reopen. Might experiments like this lead to permanent changes in work or schooling schedules? What about other long-discussed changes like universal basic income or a shorter work week? How will governments pay for the cost of the crisis, and what will the economic consequences be? There are those, like Ray Dalio, who think that printing money to pay for the crisis actually solves a long-standing debt crisis that was about to crash down on us in any case. Others disagree.

Are business models sustainable under new conditions? Many businesses, such as airlines, hotels, on-demand transportation, and restaurants, are geared very tightly to full occupancy. If airlines have to run planes with half as many passengers, will flights ever be cheap enough to attract the level of passengers we had before the pandemic? Could “on demand” transportation go away forever? Uber and Lyft were already unprofitable because they were subsidizing low prices for passengers. Or might these companies be replaced as the model evolves, much as AOL yielded online leadership to Yahoo!, which lost it in turn to Google? (My bet is that algorithmic, on-demand business models are still in their infancy.)

These topics are all over the news. You can’t escape them, but you can form your own assessment of the deeper story behind them and its relevance to your strategy. Remember to think of the stories as clustering along lines with magnitude and direction. Do they start to show patterns? More importantly, find vectors specific to your business. These may call for deep changes to your strategy.

Also remember that contrarian investments can bring outsized returns. There may be markets that you believe in, where you think you can make a positive difference for your customers despite their struggles; go long on those. For O’Reilly, this has been true of many technologies where we placed early bets despite what seemed like overwhelming odds against success. Chasing what’s “hot” puts you in the midst of ferocious competition. Thinking deeply about who needs you and your products and how you can truly help your customers is the basis for a far more robust strategy.


Big Data – VentureBeat

How to plot a line made by the touching points of two functions?

April 2, 2020   BI News and Info

Suppose we have the following functions:
f[x_,y_]:=Cos[x]+Cos[y];
g[x_,y_]:=x^2+y^2;
How can I plot the contour formed by the (x, y) points where f[x, y] touches g[x, y]? I tried the following command,
ContourPlot[f[x,y] ==g[x,y],{x,-Pi,Pi},{y,-Pi,Pi},PlotPoints -> 100];
but it failed.
Could someone please help me fix it?
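
One likely cause, judging only from the command shown: the trailing semicolon suppresses the output of ContourPlot, so the plot is computed but never displayed. A minimal sketch of the same call without the semicolon:

f[x_, y_] := Cos[x] + Cos[y];
g[x_, y_] := x^2 + y^2;

(* Dropping the trailing semicolon lets the plot display; the contour is the set
   of (x, y) points where f equals g. *)
ContourPlot[f[x, y] == g[x, y], {x, -Pi, Pi}, {y, -Pi, Pi}, PlotPoints -> 100]

If "touching" is meant in the stricter sense of tangency (equal values and parallel gradients), the equality contour above captures only the first condition.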

Recent Questions – Mathematica Stack Exchange

Dreamforce Points to New Disruption

December 4, 2019   CRM News and Info

There were more than 170,000 people at Dreamforce a couple of weeks ago, and many more watching from afar. Also, there were well over 2,000 sessions (full disclosure, I led a panel discussion on employee engagement). Numerous announcements sprang from the big Salesforce event.

Some announcements and sessions, like the Tim Cook fireside chat, stand out. Of the top six, I reckon that five were more about integration and working across vendor boundaries than CRM. Given where we are in the evolution of the industry, that’s a good thing.

To me all of this signals a new disruption forming that will change IT as fundamentally as Salesforce’s arrival on the scene did 20 years ago. Back then almost everyone thought that CRM was the significant innovation, but it was really the cloud computing and subscription software paradigms that were the big deal.

Most conventional vendors dismissed the cloud as a toy. Most cloud vendors thought of it only as an alternative delivery paradigm. They were wrong, of course. Cloud computing led a huge standardization and commoditization wave that’s still being felt. A wave forming now will take that initial wave much further.

Here’s how I see the announcements.

Tableau’s Place

Salesforce only recently completed the acquisition of Tableau, an analytics and data visualization provider, for US$ 15.7 billion. Analytics and visualization are critical for any company undergoing a digital transformation, because they enable it to make sense of all the data it’s been collecting and then to act on the findings.

Tableau does everything it can to keep the user above the fray. The demos that I saw were really gorgeous, combining multiple data sources and the ability to pivot around them with a few clicks and not a line of code in sight. This is largely in keeping with Salesforce’s approach of no-code, low-code, and code-if-you-really-need-to app development.

It’s also a form of commoditization that democratizes technology by putting it into the hands of business users, obviating the need for experts.

With Tableau there is no visible gap between Salesforce and non-Salesforce data, and that’s the point. Tableau retains its identity and brand post-merger, and much of its customer base is non-Salesforce. That’s important because I think Salesforce is aiming at a broader audience.

Einstein Voice Assistant

There is an old maxim often attributed to the early 20th century Boston politician, Martin Lomasney: “Never write if you can speak; never speak if you can nod; never nod if you can wink.”

Modern political operatives also might wish to add, “Never put it in social media or email,” for good measure, but I digress. Lomasney was speaking of potentially incriminating nefarious political activity. Salesforce seems to be recapitulating the idea in much of its doings, but only for good.

“Never code when you can click” might be the new maxim — or “when you can drag and drop,” for that matter. Another might be “Never engage in keyboard activities if you can speak.”

That seems to be the genesis of Einstein Voice, a feature built into Salesforce CRM that enables users to speak to their apps and perhaps avoid carpal tunnel syndrome. Einstein Voice also enables users to iterate more rapidly on requests if one should fail.

In this, Einstein Voice is not dissimilar to products from Oracle, Zoho and many other vendors today. Where Salesforce shines is in the completeness of its offering — from the fundamental existence of the product to the training it offers through Trailhead.

There is something of a big brother aspect to products like Einstein Call Coaching (GA winter 2020), which provides the sales coaching that managers don’t have time for today. However, it might be easier to take suggestions from a machine than from a boss, so maybe this is a good thing.

There’s also Einstein Voice Assistant and Einstein Voice Skills (beta spring 2020), which will give admins the ability to create custom versions of the voice assistant for every user.

Finally, Service Cloud Voice (GA summer 2020) will enable Einstein to read keywords from call transcripts and serve knowledge articles and next-best actions for agents.

Taken together, these innovations enable more automation throughout many CRM processes, which will free more people to do higher value-added work.

Partnership With Apple

Three of the six major announcements stemming from Dreamforce were about partnerships with other vendors. News of Microsoft and AWS partnerships dribbled out before the conference, and some activity with Apple also was indicated. Salesforce seemed to have put the most wood behind the Apple arrow by including a fireside chat with Tim Cook, Apple’s CEO.

The partnership includes tighter integration of the companies’ products, most notably reimagining the Salesforce Mobile App in iOS and the new Salesforce Mobile SDK optimized for Swift and iOS 13.

There’s also Trailhead GO with 700 modules exclusive to iOS and iPadOS. So it appears that Salesforce is trying to leapfrog its business apps’ usability to the level of consumer apps.

If you’re keeping score, that’s another example of enhancing usability and inevitably commoditizing business functions to improve business performance.

AWS’ Role

Some time ago, Salesforce and Amazon announced a partnership that would make AWS a hosting platform for future specialized Salesforce instances. The newest announcement takes this a step further.

As part of its Service Cloud Voice offering, Salesforce will offer the Amazon Connect contact center service. This will include Einstein Voice interoperability with Amazon Alexa, as well as other voice assistants in the future.

Collaboration With Microsoft

Salesforce and Microsoft last month announced joint efforts on several fronts, including naming Microsoft Azure as public cloud provider for the Salesforce Marketing Cloud, and new integrations between Microsoft Teams and the Salesforce Sales and Service Clouds.

I could be wrong, but I thought Marketing Cloud components — once Exact Target and Pardot — were situated on Azure to begin with. If so, the announcement is more about business relations than technology.

Regardless, Microsoft is an increasing part of Salesforce’s life, as it should be given the huge customer base the companies share. These announcements provide further evidence of a mass market approach — not only to CRM but also to IT generally. I’m looking for more out of this relationship, and quite possibly Microsoft will turn into the glue that holds together a future IT utility.

Customer 360 Truth

When it was announced shortly before Dreamforce, “Customer 360 Truth” seemed like an awkward name. It still does. The intent was to position Salesforce as the single source of information for its customers — the sole source of truth.

In a world of competing data sources and analytics engines, that promise has currency. However, developing customer trust was also an important Dreamforce theme, and calling something “truth” is like saying “you can trust me,” which is a technique of scoundrels. That’s not what Salesforce is about.

To get customer trust, one first has to put some skin in the game but that seemed lacking. In this instance, building trust requires a vendor to declare what it is honor-bound to do for customers, and then to live up to the declaration. That’s been the way it has worked for ages, with promises like “double your money back” in a simpler time.

Declarations like that seem rare in the modern world. Instead of putting their honor behind their policies, vendors seem happy to highlight their wonderful technology, strongly implying that it will never fail. In response to that I have two words: Cambridge Analytica.

At the time of the Customer 360 Truth announcement, Salesforce pushed a related idea — CIM, or customer information model — along with its partners, The Linux Foundation’s Joint Development Foundation, Genesys and AWS.

They positioned CIM as a way to standardize data interoperability across cloud applications, but I think there’s more here. First, it’s an information model and not a data model, and there’s a difference. Apps have data models, but integrations of multiple apps need information models, because the several applications in an integration need to trade information, and that’s different.

Whatever you want to call it, the edifice of technologies and partnerships Salesforce has constructed increasingly is coming to dominate information exchange — the same way that SQL dominates data storage. The parallel is important, because it signals further consolidation of the industry into a utility.

My Two Bits

I started this marathon stating that we’re witnessing the formation of a new disruption that will change IT as fundamentally as Salesforce’s arrival on the scene did 20 years ago. Cloud computing and subscriptions were the disruption then, and they fundamentally dissociated enterprises from their IT.

The new disruption appears to be reassembling the components into a new whole. In this case, we’re watching the industry further commoditize, removing labor from practice and simplifying approaches.

This has been going on as long as enterprises have been using computers, but the consolidation we’re now watching will doubtless deliver some benefits that we currently look to government to provide.

Most notably, if we’re going to have a grid that supports information sharing and interoperability, we’ll need to create some standards and protocols for reducing or eliminating some of the worst effects of hacking and cyberwarfare.

The private sector now has it within its power to lock out bad actors, preventing them from accessing their life blood: data. In a world where truth and trust are of paramount importance, it’s hard to see how you support these values unless you also ostracize bad actors. We might still be a few years away from locking down cloud IT, but the area where bad guys can operate already is shrinking.

Oracle’s Autonomous Database makes it nearly impossible for bad actors to gain access to sensitive IT farms. Partnerships between the largest IT vendors — like Salesforce, Microsoft and Oracle — are generating increasingly stringent standards and robust systems that span vendor silos.

At some point vendors that don’t get on board, citing First Amendment concerns for the bad guys, will find their access and influence greatly diminished.

All this seems like it’s a long way from Dreamforce, but it’s not. Dreamforce, as usual, showed us a glimpse of the future, when CRM is largely built out. That era will be more about leveraging the sophistication of technologies.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.


Denis Pombriant is a well-known CRM industry analyst, strategist, writer and speaker. His new book, You Can’t Buy Customer Loyalty, But You Can Earn It, is now available on Amazon. His 2015 book, Solve for the Customer, is also available there.
Email Denis.

CRM Buyer

Reverse color of overlapping points?

November 10, 2019   BI News and Info

Is there a simple way to change the color of the region where two Points overlap?
For example, in Graphics[{PointSize[2], Point[{1, 1}], Point[{0, 0}]}], the resulting two points have an area where they overlap (their union). I am looking for a simple way to make that area White, for example.

The reason I am asking is that I have a list of data points, some of which overlap. I want to show the data in a graph but I want the reader to be easily able to realize that two neighboring and overlapping points are separate. That is why I am looking for an easy way to reverse the color of their union. Points that overlap perfectly, I treat separately, by making a circle around the first point.
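
A minimal sketch of one possible approach, not from the original thread: draw the two markers as explicit disks (the centers and radius here are assumptions standing in for the oversized PointSize[2] markers) and repaint their intersection in white with RegionPlot.

p1 = {0, 0}; p2 = {1, 1}; r = 1;  (* assumed centers and radius *)
Show[
 Graphics[{Black, Disk[p1, r], Disk[p2, r]}],
 RegionPlot[Norm[{x, y} - p1] <= r && Norm[{x, y} - p2] <= r,
  {x, -1.5, 2.5}, {y, -1.5, 2.5},
  PlotStyle -> White, BoundaryStyle -> None]  (* the lens-shaped overlap, filled white *)
]

Replacing White with any other colour gives the same "reversed" overlap effect.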

Recent Questions – Mathematica Stack Exchange

Multiple Integration Points between Microsoft Dynamics 365 Customer Engagement and Business Central for Improved Efficiency

October 4, 2019   Microsoft Dynamics CRM

Microsoft Dynamics business management solutions offer a fully integrated platform from which you can manage all aspects of your organization. If you are currently using the CRM solution Microsoft Dynamics 365 Customer Engagement and are considering acquiring Microsoft Dynamics 365 Business Central to manage your operations in the cloud or vice-versa, Microsoft has made the process even easier with multiple integration points between the two.

By combining both platforms, your organization can obtain a 360° vision of your business, ensuring everyone has the visibility they need on all the data relevant to their tasks. This increases synergy and productivity, while also improving collaboration and communication between teams and users. Data will be kept up to date in both systems at once, which will reduce the number of manual or duplicate entries, saving time and reducing the risk of errors.

Dynamics 365 uses mappings to associate specific types of data records in the two solutions, avoiding duplication. Here are some of the integration points between Dynamics 365 Customer Engagement and Business Central, including the direction of the integration:

Customer Engagement Direction Business Central
Customer ↔ Account
Contact ↔ Contact
Currency → Transaction Currency
Customer Price Group → Price List
Item ↔ Product
Opportunity ↔ Opportunity
Resource ↔ Product
Sales Invoice Header → Invoice
Sales Invoice Line → Invoice Product
Sales Price → Product Price List
Salesperson ← User
Unit of Measure → Unit Group

Updating these records in either one of the solutions will automatically push the changes to the other, depending on the direction of the integration. This ensures that the sales team has a view on transactions and invoices, while the financial team has access to customer details when needed. As such, your organization will benefit from a fully cloud-based platform to store all of its data and perform the complete management of its sales and operations.

By JOVACO Solutions, Microsoft Dynamics 365 integration specialist in Quebec

CRM Software Blog | Dynamics 365

Compare two plots by finding the minimum distance among points

September 23, 2019   BI News and Info

I have a question about comparing the points within two plots.
I would like to compare two plots and find the minimum distance among their points, in order to find the nearest/common points (i.e., those with minimum, or zero, distance) and plot them overlapping.
What I did was extract the coordinates of their respective points, but I do not know how to compare them or the two plots. I used the following line of code, but the result is completely different from what I am looking for.

Outer[EuclideanDistance, seq1, seq2, 1] // Flatten

The result should show the points that are equal (or almost in common) between the two plots.

Could you please help me?
Many thanks,
Val
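
A hedged sketch of one way to get there, assuming seq1 and seq2 are the extracted lists of {x, y} coordinates mentioned in the post; the tolerance is a made-up value that should be adjusted to the scale of the data.

tol = 0.05;  (* assumed tolerance for "almost in common" *)
closePairs = Select[Tuples[{seq1, seq2}],
   EuclideanDistance[#[[1]], #[[2]]] < tol &];
nearCommon = closePairs[[All, 1]];

(* Overlay both point sets and highlight the near-common points. *)
ListPlot[{seq1, seq2, nearCommon},
 PlotStyle -> {Blue, Orange, Directive[Red, PointSize[Large]]}]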

Recent Questions – Mathematica Stack Exchange

Provide a list of colours for the points of ListPlot

September 19, 2019   BI News and Info

I want to colour the points in a ListPlot with a list of colours, i.e. something like

data = Table[Cos[x]^2, {x, 0, 10, 0.1}];
colourData = Table[Cos[x]^2, {x, 0, 10, 0.1}];
colours = (ColorData["TemperatureMap"][#] &) /@ colourData;
ListPlot[data,Joined->True (* Something like PlotStyle -> colours*)]

And I want an output in which each point is coloured according to colourData.

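A minimal sketch of one possible approach, not necessarily the answer from the original thread: build the line and markers directly with VertexColors, so each vertex takes its own colour from the TemperatureMap scheme.

data = Table[Cos[x]^2, {x, 0, 10, 0.1}];
colours = ColorData["TemperatureMap"] /@ data;  (* here colourData equals data *)
pts = Transpose[{Range[0, 10, 0.1], data}];

Graphics[{Line[pts, VertexColors -> colours],
  PointSize[Medium], Point[pts, VertexColors -> colours]},
 Axes -> True, AspectRatio -> 1/GoldenRatio]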

Recent Questions – Mathematica Stack Exchange

BI Implementation Insights: Clear and Easy Starting Points

April 6, 2019   Sisense

Business intelligence implementation can seem like a daunting task at the outset. There are so many moving parts, needs, and requirements that finding the right starting point may feel like a shot in the dark.

However, one of the most important aspects of running a successful business intelligence project is finding the right starting point. Clear starting points can help you launch new projects faster and acclimate to your platform’s tools. They can make your life and the life of your end users a bit easier.

In this “BI Implementation Insights” series I’m going to walk you through implementing your analytics with as few hiccups as possible. In this first post, I’ll give you a quick and easy cheat sheet for always finding the right starting point for your projects.



Finding the Starting Line

Start by identifying which groups will be the first to benefit from your dashboards.

Consider that dashboards generally have two sets of users, who together are dashboard stakeholders. They are the power users and business users, and their buy-in will determine the success of your dashboarding project. Power users are analysts and other data experts who will spend a lot of time using the software. Business users are the end users who rely on the data from the BI platform to make decisions every day. New projects need to consider the needs of these users.

Open with questions to identify opportunities.

Your goal is to identify your power users’ low-hanging challenges and processes that you can quickly and easily use BI to solve or automate. Start with simple challenges like basic analyses and tasks with lower data complexity, such as ratios, top/bottom x items, and data from a single data source.

Throughout this process, you will test new features and explore relevant data to tackle bigger problems. You will prove value by creating simple-to-use dashboards that both introduce BI into your core group’s workflow and provide a clear ROI in terms of time saved.

Starting Point Cheat Sheet

Now for that cheat sheet I mentioned earlier.

1. Identify primary stakeholders for your project.

2. Pose questions that need answering.

3. Evaluate the complexity associated with each.

4. Choose low hanging fruit first.

5. Complete a planning document covering:

  • An outline
  • Concrete goal(s)
  • Target audience
  • Call to action
  • Data update frequency
  • Timelines
  • Mockups
  • Define KPIs (necessary to answer your primary business question)
  • Data requirements
  • Security and additional constraints (optional)

6. Use the agile methodology to drive your implementation toward a small batch of high-priority KPIs. This methodology consists of three phases: planning, execution, and delivery.


7. Use the inverted-pyramid model and start dashboards with high-level indicators, working down from indicators to trends to details. By doing this you’ll be displaying the most significant insights on the top part of the dashboard, trends in the middle, and granular details at the bottom.

8. Think laterally.

With your project launched, your KPIs will start yielding insights and new questions will begin to emerge. This allows you to start experimenting by approaching the same issue from different angles and casting a broader net in terms of the datasets you’re using and how you’re using them.

Once a model is designed, analyzing the data via different dimensions can yield insights that were not obvious before. For example, if you have data on ad unit performance, analyzing it vs the size of the units could reveal interesting results: does ad unit size influence the click-through rates? What else are you missing?

Combining tangential or parallel data points to your datasets can yield correlations that answer questions you hadn’t even thought to ask and present meaningful insights you might otherwise have missed.

You’re on Your Way

If you follow the eight-step cheat sheet above you’ll be well on your way to a smooth and effective analytics implementation. Implementations don’t stop there, though! Coming up in the BI Implementation Insights series I’ll talk about building a BI team for success, how to outline an implementation timeline, and a deep-dive into how to identify early wins.


Blog – Sisense

4 Big Data Infrastructure Pain Points and How to Solve Them

November 19, 2018   Big Data

Christopher Tozzi


Making the most of big data requires not just having the right big data analytics tools and processes in place, but also optimizing your big data infrastructure. How can you do that? Read on for tips about common problems that arise in data infrastructure, and how to solve them.

What Is Big Data Infrastructure?

Big data infrastructure is what it sounds like: The IT infrastructure that hosts your “big data.” (Keep in mind that what constitutes big data depends on a lot of factors; the data need not be enormous in size to qualify as “big.”)

More specifically, big data infrastructure entails the tools and agents that collect data, the software systems and physical storage media that store it, the network that transfers it, the application environments that host the analytics tools that analyze it and the backup or archive infrastructure that backs it up after analysis is complete.

Lots of things can go wrong with these various components. Below are the most common problems you may experience that delay or prevent you from transforming big data into value.

Slow Storage Media

Disk I/O bottlenecks are one common source of delays in data processing. Fortunately, there are some tricks that you can use to minimize their impact.

One solution is to upgrade your data infrastructure to solid-state disks (SSDs), which typically run faster. Alternatively, you could use in-memory data processing, which is much faster than relying on conventional storage.

SSDs and in-memory storage are more costly, of course, especially when you use them at scale. But that does not mean you can’t take advantage of them strategically in a cost-effective way: Consider deploying SSDs or in-memory data processing for workloads that require the highest speed, but sticking with conventional storage where the benefits of faster I/O won’t outweigh the costs.


Lack of Scalability

If your data infrastructure can’t increase in size as your data needs grow, it will undercut your ability to turn data into value.

At the same time, of course, you don’t want to maintain substantially more big data infrastructure than you need today just so that it’s there for the future. Otherwise, you will be paying for infrastructure you’re not currently using, which is not a good use of money.

One way to help address this challenge is to deploy big data workloads in the cloud, where you can increase the size of your infrastructure virtually instantaneously when you need it, without paying for it when you don’t. If you prefer not to shift all of your big data workloads to the cloud, you might also consider keeping most workloads on-premise, but having a cloud infrastructure set up and ready to handle “spillover” workloads when they arise—at least until you can create a new on-premise infrastructure to handle them permanently.

Slow Network Connectivity

If your data is large in size, transferring it across the network can take time—especially if network transfers require using the public internet, where bandwidth tends to be much more limited than it is on internal company networks.

Paying for more bandwidth is one way to mitigate this problem, but that will only get you so far (and it will cost you). A better approach is to architect your big data infrastructure in a way that minimizes the amount of data transfer that needs to occur over the network. You could do this by, for example, using cloud-based analytics tools to analyze data that is collected in the cloud, rather than downloading that data to an on-premise location first. (The same logic applies in reverse: If your data is born or collected on-premise, analyze it there.)

Suboptimal Data Transformation

Getting data from the format in which it is born into the format that you need to analyze it or share it with others can be very tricky. Most applications structure data in ways that work best for them, with little consideration of how well those structures work for other applications or contexts.

This is why data transformation is so important. Data transformation allows you to convert data from one format to another.

When done incorrectly—which means manually and in ways that do not control for data quality—data transformation can quickly cause more trouble than it is worth. But when you automate data transformation and ensure the quality of the resulting data, you maximize your data infrastructure’s ability to meet your big data needs, no matter how your infrastructure is constructed.
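
As a concrete illustration of the automate-and-check idea, here is a minimal sketch in the Wolfram Language; the file names and column layout are hypothetical, not from the article.

raw = Import["sales_raw.csv"];   (* hypothetical file: a header row, then {date, amount} rows *)
rows = Rest[raw];                (* drop the header row *)
clean = <|"date" -> ToString[#[[1]]], "amount" -> N[#[[2]]]|> & /@ rows;

(* Control for data quality before handing the data on: every amount must be numeric. *)
If[AllTrue[clean, NumericQ[#["amount"]] &],
 Export["sales_clean.json", clean, "JSON"],
 Print["Transformation aborted: non-numeric amounts found."]]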


Syncsort Blog

Plot3D seems to be ignoring points

March 28, 2018   BI News and Info

Consider the following function:

f[N_, i_, g_]:= PDF[BinomialDistribution[N, g], i]

The output of

Plot3D[f[100, i, g], {i, 0, 100}, {g, 0.1, 0.9}]

produces a plot in which the surface appears to be zero over most of the domain.

Looking only at the 3D plot would incorrectly suggest that, say, f(100, 25, 0.2) = 0. However, the value of f at this point is approximately 0.04. My question is: how can I make the output of Plot3D more accurate for the function above?
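
A hedged sketch of one workaround: because BinomialDistribution is a discrete distribution, PDF[BinomialDistribution[n, g], i] is zero at every non-integer i, so Plot3D's samples mostly land on zeros and the surface looks flat. Plotting the smooth analytic form, which agrees with the PDF at integer i, recovers the expected shape (n is used in place of N, which is a built-in symbol).

fSmooth[n_, i_, g_] := Binomial[n, i] g^i (1 - g)^(n - i);

(* Agrees with PDF[BinomialDistribution[n, g], i] whenever i is an integer. *)
Plot3D[fSmooth[100, i, g], {i, 0, 100}, {g, 0.1, 0.9},
 PlotRange -> All, PlotPoints -> 50]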

Recent Questions – Mathematica Stack Exchange