Incremental refresh in Power BI Premium

Power BI Premium became generally available less than a year ago as the platform of choice for enterprises deploying high-performance, scalable BI systems in the cloud. In the May release of Power BI Desktop, we announced the next step in our vision: Power BI Premium supports incremental refresh (preview)! Incremental refresh enables large datasets in the Power BI Premium service, with the following benefits:

  • Refreshes are faster. Only data that has changed needs to be refreshed. For example, refresh only the last 5 days of a 10-year dataset.
  • Refreshes are more reliable. For example, it is not necessary to maintain long-running connections to volatile source systems.
  • Resource consumption is reduced. Less data to refresh reduces consumption of memory and other resources.

Please see this article to learn how to use incremental refresh.

In the following video, Adam Saxton from the Power BI CAT Team and Christian Wade from the Power BI Premium product team discuss the significance of incremental refresh.

Incremental refresh has traditionally been in the realm of Azure Analysis Services for high-end, scalable BI models. While it is one of Analysis Services’ greatest strengths, it is a feature that can require considerable time and effort to set up. With Power BI Premium, incremental refresh is simplified to essentially a dialog box with a few checkboxes and dropdowns!

Related items coming soon

Shared capacity support

Based on customer feedback, we plan to make incremental refresh available on shared-capacity (non-Premium) workspaces.

Update metadata

Once the dataset is published and refreshed, any change to the model or reports requires republishing from Power BI Desktop to the service. The current publish workflow detects when there is already a dataset with the same name and prompts whether it should be replaced. Replacing a dataset in this way also replaces the data within it, which could mean having to reload historical data. We plan to provide the ability to update the dataset while retaining its data on publish. Incremental refresh will stay in public preview until this feature is released.

Increased dataset size

We plan to remove the 10 GB dataset-size limit in the Power BI Premium service. This will allow datasets utilizing incremental refresh to be limited only by the capacity size.

Override effective date

We plan to allow setting the current date for a refresh operation. This will be useful with datasets like Adventure Works that don’t have data up to the current date, and for testing purposes.

XMLA Endpoints

XMLA endpoints will enable connectivity, manageability, and programmability in Power BI Premium comparable to that of Azure Analysis Services. For example, you will be able to use SQL Server Management Studio (SSMS) to connect to datasets with incremental-refresh policies and generate scripts to refresh specific historical periods (partitions).


Microsoft Power BI Blog | Microsoft Power BI

99 cent store special


Deep Fried Bits

You implemented a new CRM system. Great! But why is no one using it?

You implemented a new CRM system. Great! The hardest part is done now, right? Well . . . not exactly. Getting your employees to actually use your new system can often be a major hurdle.

According to Principal Analyst Bill Band of Forrester, the biggest threat to a successful CRM project implementation is slow user adoption. Band also states that not only is early and pervasive user acceptance one of the biggest indicators of success, it is a MANDATORY precursor to success. But before we dig into this, it’s important to define exactly what we mean by user adoption.

USER ADOPTION DEFINED

User adoption is not simply the use of a CRM system. True user adoption occurs only when users want to use the CRM system. Users must be motivated and have a genuine desire to achieve key benefits that the system can bring. You can’t just force a new technology system on your staff. That strategy will never last long term and trying to push it through will be very painful for you and your entire staff.

Our CRM consultants have also found that having a power user, someone who is tech savvy and a team player, can help encourage others to use the new system. This can be helpful as users are often more likely to listen to their colleague than they are to an outside consultant. Having said that, there are five areas of focus that can make or break the user adoption of your CRM system:

5 FACTORS THAT CAN MAKE OR BREAK THE USER ADOPTION OF YOUR CRM SYSTEM

1. Executive Sponsorship

Getting buy-in from the executive level can make a world of difference. Leaders can articulate the strategy, goals and importance of the CRM system, making it easier for employees to understand their role in rolling out the project.

2. Business Processes

As you are preparing to roll out your new CRM platform, it is the perfect time to review your business processes. A system like Microsoft Dynamics CRM is highly configurable and has the flexibility to adapt to whatever business processes you identify. This also allows the CRM to grow with your organization, as your CRM partner can help you reconfigure the system as needed if your business processes change over time.

3. Communication

This seems like an obvious one but it can’t be overlooked. We encourage our clients to develop a communications plan to keep the executive team updated on the status of the CRM project, including “wins” along the way. Communication to the end users is also important so your staff is not left guessing as to what is happening and when.


CRM Software Blog | Dynamics 365

Nvidia is training robots to learn from watching humans

Nvidia has developed a method to train robots to carry out actions by first watching human activity. In initial applications, a Baxter robot in a lab environment learned to pick up and move colored boxes and a toy car.

Learnings from such research will be used to retrain robots and to create robots that can work safely alongside people in industrial settings and homes.

“In the manufacturing environment, robots are really good at repeatedly executing the same trajectory over and over again, but they don’t adapt to changes in the environment and they don’t learn their tasks,” Nvidia principal research scientist Stan Birchfield told VentureBeat in an interview. “So to repurpose a robot to execute a new task, you have to bring in an expert to reprogram the robot at a fairly low level, and it’s an expensive operation. What we’re interested in doing is making it easier for a non-expert user to teach a robot a new task by simply showing it what to do.”

The system uses a series of deep neural networks to perform perception, planning, and control, and the networks are trained entirely on synthetic data.

“There’s sort of a paradigm shift happening in the robotics community now,” he said. “We’re at the point now where we can use GPUs to generate essentially a limitless amount of pre-labeled data for free to develop and test algorithms, and this is potentially going to allow us to develop these robotics systems that need to learn how to interact with the world around them in ways that scale better and are safer.”

The findings were shared today at the International Conference on Robotics and Automation (ICRA) taking place this week in Brisbane, Australia.

The new AI system was made with help from the Nvidia robotics research lab. First announced late last year, the lab now has six employees and is preparing to open up offices adjacent to the University of Washington in Seattle this summer.

The research lab will continue to work with the robotics community and in-house at Nvidia to explore the use of synthetic datasets for training AI systems, Nvidia head of robotics research Dieter Fox told VentureBeat.

Such knowledge could be used to strengthen the Isaac SDK, a framework for training robots with simulations first introduced in May 2017.

“Nvidia actually has been working in that domain for quite a while, in the gaming context for instance, where it’s all about setting up 3D virtual environments that are photorealistic and give you some kind of content modeling. What we want to do is also work with all these teams that have all this expertise, but help them expand it in a way so that it becomes better applicable in a robotics setting,” Fox said.

Research like the kind released today will be central to the creation of the next generation of robots, Fox said.

“We’re talking about robots that have to open doors, open drawers, pick up objects, move them around, even physically interacting with people, helping them, for example elderly people in the home,” he said. “These robots need to be able to recognize people, they need to see what a person wants to do, they need to learn from people, learn from demonstration for example, and they also need to be able to anticipate what a person wants to do in order to help them.”

Nvidia joins a growing number of companies, like Google and SRI International, interested in the development of AI systems with a kind of environmental awareness, or, as Google AI chief Jeff Dean put it, more “common sense.”

To read more about the new robotics research, see this Nvidia blog post.


Big Data – VentureBeat

When Screen Scraping became API calling – Gathering Oracle OpenWorld Session Catalog with …


A dataset with all sessions of the upcoming Oracle OpenWorld 2017 conference is nice to have – for experiments and demonstrations with many technologies. The session catalog is exposed at a website here.

With searching, filtering and scrolling, all available sessions can be inspected. If data is available in a browser, it can be retrieved programmatically and persisted locally in, for example, a JSON document. A typical approach for this is web scraping: having a server-side program act like a browser, retrieve the HTML from the web site, and query the data from the response. This process is described, for example, in this article – https://codeburst.io/an-introduction-to-web-scraping-with-node-js-1045b55c63f7 – for Node and the Cheerio library.
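For a static page, that approach is quite compact. Below is a minimal sketch of it in TypeScript on Node with the Cheerio library mentioned above; the URL and the CSS selector are hypothetical placeholders, not the actual session-catalog markup.

```typescript
// Minimal server-side scraping sketch with Cheerio (hypothetical page).
import * as cheerio from "cheerio";

async function scrapeSessionTitles(url: string): Promise<string[]> {
  const res = await fetch(url);            // built-in fetch (Node 18+)
  const html = await res.text();           // only useful if the HTML is static
  const $ = cheerio.load(html);            // parse the HTML into a queryable tree
  return $(".session-title")               // hypothetical selector
    .map((_, el) => $(el).text().trim())
    .get();
}

scrapeSessionTitles("https://example.com/sessions")
  .then((titles) => console.log(titles));
```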

However, server-side screen scraping of HTML will only be successful when the HTML is static. Dynamic HTML is constructed in the browser by executing JavaScript code that manipulates the browser DOM. If that is the mechanism behind a web site, server-side scraping is at the very least considerably more complex (as it requires the server to emulate a modern web browser to a large degree). Selenium has been used in such cases – to provide a server-side, programmatically accessible browser engine. Alternatively, screen scraping can also be performed inside the browser itself – as is supported for example by the Getsy library.

As you will find in this article – when server-side scraping fails, client-side scraping may be a much too complex solution. It is very well possible that the rich client web application is using a REST API that provides the data as a JSON document – an API that our server-side program can also easily leverage. That turned out to be the case for the OOW 2017 website – so instead of complex HTML parsing and server-side or even client-side scraping, the challenge at hand resolves to nothing more than a little bit of REST calling. Read the complete article here.
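As a sketch of how little code that REST-calling approach takes, here is a hypothetical Node/TypeScript version; the endpoint URL and response shape are assumptions, since the article does not quote the actual OOW 2017 API.

```typescript
// Hypothetical sketch: call the catalog's REST API and persist the JSON.
import { writeFile } from "node:fs/promises";

interface Session {
  code: string;    // assumed field names; the real API may differ
  title: string;
}

async function downloadCatalog(endpoint: string): Promise<void> {
  const res = await fetch(endpoint);              // plain REST call, no browser emulation
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const sessions: Session[] = await res.json();   // data arrives already structured
  await writeFile("sessions.json", JSON.stringify(sessions, null, 2));
  console.log(`Saved ${sessions.length} sessions to sessions.json`);
}

downloadCatalog("https://example.com/api/v1/sessions");
```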

PaaS Partner Community

For regular information on business process management and integration, become a member of the SOA & BPM Partner Community. For registration, please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.


Oracle Blogs | Oracle The Data Warehouse Insider Blog

Debit Card Skimming and Fraud

This March, in Indonesia, we had a hard wake-up call: debit card skimmers are still at large. We found out that more than 1,400 cards were skimmed by a single team of perpetrators, and 1,200 of those skimmed cards came from a single bank. According to the news, these fraudsters, who started less than a year ago in July 2017, are working in several big cities across Indonesia.

The media reported that the skimmers stole up to 22 million IDR, which is around $1,500 USD (rate USD 1 = IDR 13,700), per customer. If we extrapolate the numbers, assuming a customer loses $1,000 USD on average (and the bank has to pay that balance back), and around 1,000 customers are impacted each year, that’s approximately $1 million USD stolen in just a few months by a single team of perps. Also, it’s quite likely that there were more incidents than reported, but the banks did not share the actual number so as not to alarm the public.

From the bank’s point of view, the monetary loss was not monumental (what’s $1 million to the billions the bank makes a year?), but from the overall business point of view, this will impact:

  1. Trust
  2. Customer experience
  3. Brand Name of the Bank
  4. Reputation: being famous for the wrong reasons as an “easy target” means more will try to find “loopholes”

A poor perception of trust, a damaged customer experience and brand name, and a reputation for being an easy target can have a long-lasting negative impact on the bank. To combat this, banks need to implement adaptable anti-fraud solutions as soon as possible that can help them prevent and reduce the activity of these tricky fraudsters.

Based on my experience as a technologist, I have looked at these acts of fraud from the tech angle to see how they might be prevented.

How to prevent bank fraud using TIBCO solutions

First, to understand fraud/skimming, you need to understand how it happens and when it happens. It all depends on data. While most of your data is probably in a data warehouse, there might be other data sources that you use. You might use data from Excel, another database or table, a web service, or even another SaaS tool such as Google Analytics.

If you have so many disparate sources of data, you can start the fraud detection process by using TIBCO Data Virtualization. Data virtualization gives you a seamless way to collect data from many disparate sources and make all your data uniform, so you can get a 360-degree view in a single dashboard. After all, you know what they say: “garbage in, garbage out”. You need to have good data going into your dashboard so you get good data coming out of it.


Then, using analytical tools such as TIBCO Spotfire combined with your historical data, you can start to analyze your data to find which kinds of transactions are likely to be fraudulent.

After doing the analysis, the next step is implementing the analytical model in real time, either as an event rule or as a streaming analytical event. The two can be used separately or in conjunction with one another. Of course, we’re not stopping there.
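To make the “event rule” option concrete, here is a minimal, tool-agnostic sketch in TypeScript of a classic skimming signal (the same card used at two different ATMs within minutes). It is an illustration only, not TIBCO’s actual rule language or API, and the threshold is an assumption.

```typescript
// Flag a card used at two different ATMs within a short window,
// a classic skimming signal. Thresholds and types are assumptions.
interface Txn {
  cardId: string;
  atmLocation: string;
  timestamp: number; // epoch milliseconds
}

const WINDOW_MS = 10 * 60 * 1000; // 10-minute window (assumed threshold)
const lastSeen = new Map<string, Txn>();

function scoreTransaction(txn: Txn): "ok" | "suspicious" {
  const prev = lastSeen.get(txn.cardId);
  lastSeen.set(txn.cardId, txn);
  if (
    prev &&
    prev.atmLocation !== txn.atmLocation &&
    txn.timestamp - prev.timestamp < WINDOW_MS
  ) {
    // Same card, different ATM, minutes apart: route to case management
    // for review rather than blocking the card outright.
    return "suspicious";
  }
  return "ok";
}
```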


While we can monitor and score transactions in real time, we also need to open a case in Case Management whenever we find potential fraud. This enables us to catch suspicious activities and also ensures that we don’t block the card unless we’re 100% sure that a transaction is fraudulent.

The Financial Fraud Accelerator is available from the TIBCO Community to help you speed up understanding the data, visualizing the data, building the analytics model, doing machine learning, model scoring, and so on. This accelerator, or template, is an all-in-one, end-to-end solution for fraud detection.


TIBCO’s fraud detection tools can be implemented either in a modular manner or as an end-to-end solution, depending on your needs. So, what are you waiting for? Fraud can be lethal to banks.



The TIBCO Blog

Middle Office or No-Man's Land?

It was once common knowledge that Apttus was the next big acquisition target on Salesforce’s radar, until Salesforce decided to acquire another CPQ (configure, price, quote) vendor to augment its Sales Cloud. In the two-ish years since, Apttus has nicely pivoted, diversifying its partner base and justifying its existence with a new strategy and nomenclature around what it calls the “middle office.” I was present at Accelerate, the company’s annual user conference, last week in San Francisco.

As you can assume, the middle office is neither front-office customer relationship management nor back-office enterprise resource planning. It is a potentially viable territory where back-office data influences front-office business processes. As you might expect, the middle is rife with landmines for all companies with aging on-premises ERP systems and a need to connect with cloud CRM.

That the middle office has taken so long to emerge from the cloud revolution does not make it suspect as an entity; it more reflects the reality that it wasn’t needed as long as front- and back-office infrastructures were being built out.

However, front- and back-office clouds largely are built out at this point, even if there are many holdouts, and this raises an interesting question: Can independent middle-office vendors survive, or will they of necessity be absorbed by either ERP or CRM purveyors?

The Data Dilemma

Apttus tried to answer that question for itself last week by highlighting its unique perch overseeing the sales process, with a platform (OMNI) and an analytics capability (MAX Proactive) native to its central domain, utilizing the data models on either side.

Demonstrations using focused artificial intelligence and machine learning showed Max advising salespeople on next best actions in a variety of situations, from identifying target opportunities to closing the quarter.

As with any such sales analytics, the machine learning requires a hefty data set to properly train it to make the best recommendations. Unfortunately, a chicken-and-egg situation crops up almost immediately for many sales organizations: Their data is in one or many ways inadequate.

Either it reflects bad practices that the client company wants to get away from — such as excessive discounting — or the company has a chaotic sales process, and its data doesn’t lend itself to training anything.

Only about a quarter of sales organizations actually have the well-managed sales processes that truly would benefit from Apttus-style analytics, based on decades of excellent research from CSO Insights, which I’ve referenced previously. Another quarter have chaotic processes and would need a lot of work to get up to speed.

That leaves half of the sales organizations in some form of no-man’s land, in need of better processes but lacking the data to train a model.

Adaptability Matters

Apttus could dedicate itself to the top quarter of companies and make a nice living, while others work to catch up — but if history is a guide, many won’t. Fortunately, Apttus doesn’t have to limit itself. In addition to its machine learning, Apttus has given itself a back door for getting into sales organizations that lack the needed training data but avidly want to improve.

The Apttus model has a capability for setting explicit rules in the event ML algorithms fail to find enough data to grow on. This leaves the organization free to work and collect better data in advance of a time not too far down the road when it can train its algorithms adequately.

It’s a neat trick if it works, but the fly in the soup is that any SFA vendor with analytics can do the same thing. Where Apttus further differentiates, though, is in its ability to easily access the data models on each side of the middle office and adapt to specific situations.

That means it can use back-office data such as pricing, personnel and catalog to steer front-office processes, which include making quotes and closing business. Its real advantage might be that its focus is on the middle and not the more provincial needs of either major silo. As CEO Kirk Krappe says, it’s in the middle where commerce gets done. He has a point.

My Two Bits

Apttus — and any other middle-office vendor — has an opportunity to stake out a unique territory that, from all indications, has a valid rationale for its existence. However, a valid rationale isn’t enough to guarantee success.

Other vendors, especially in the front-office cloud, easily could take over the niche and turn it into a feature. Apttus’ challenge is to put enough “stuff” into the middle office to make it difficult for others on its flanks to muscle in — and it has the stuff.

That’s why it introduced OMNI, its version of a platform that consolidates its advances in CPQ and machine learning into a useful whole, and importantly, a digestible narrative.

Apttus’ big challenge right now is to be crisp in its messaging and positioning to get its ideas to the largest audience, and it has some work to do, though it’s early days. A conversation with a representative of Thomson Reuters, a year-old customer with 1,200 sales users, suggests that Apttus knows what it is doing. The need now is to scale.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.


Denis Pombriant is a well-known CRM industry analyst, strategist, writer and speaker. His new book, You Can’t Buy Customer Loyalty, But You Can Earn It, is now available on Amazon. His 2015 book, Solve for the Customer, is also available there.
Email Denis.


CRM Buyer

Teradata Reports 2018 First Quarter Results

Download PDF
Teradata United States

Help Improve Power BI

We love hearing feedback from the Power BI community, and today we have another quarterly survey to keep improving Power BI with your feedback. In the past, our survey has focused on Power BI Desktop, but this quarter we are expanding it to cover the Power BI service as well. This survey helps us to understand what we're doing well, and what we can change to make your experience even better. This is a great opportunity to provide feedback directly to the product team and influence our development. In addition, if you take the survey by 5 p.m. PST on May 30th, you will be entered for a chance to win a $50 gift card.


Note about survey prizes:
We will randomly select three survey respondents to each win a $50 gift card. Winners will be notified by June 1st, 2018, and a list of winners may be requested by emailing ucsurvey@microsoft.com after this date. For our sweepstakes rules, please refer to http://www.microsoft.com/usability/uxcsweeps.htm. To be eligible for our sweepstakes, you must complete the survey. You must also be a U.S. Citizen or Permanent Resident with a valid Social Security Number. Only one entry into the sweepstakes will be eligible per person. In accordance with IRS regulations, we are required to collect 1099 information (your address and Social Security Number) if the suggested retail value of gratuity items that you select exceeds $599 in a given calendar year. We will not ask you to provide any tax information in this survey, but we will contact you before shipping your gratuity item, if necessary. Questions regarding gratuity issues may be directed to ucsurvey@microsoft.com.


Microsoft Power BI Blog | Microsoft Power BI

Dog Laments Missing Dog Pal

Animals grieve a loss too.

My dog died almost a year ago and he saw the picture of her and recognized her. He misses his sister just as much as we miss her.

“Missed friends.”
Image courtesy of https://imgur.com/gallery/OXkhb02.



Quipster