Why Automakers Are Turning To Resilient Lean

Risk management is at the heart of every production process and supply chain. Companies are realizing that a single small problem can bring manufacturing operations to a halt, often leading to unexpected downtime and missed deadlines. The automotive industry is no exception.

Most automotive manufacturers want to have greater control of their production lines. Many technology companies, material suppliers, and OEM parts builders help make vehicles that will satisfy customer demand. If even one of these companies experiences an issue, production runs can come to a complete stop. This scenario can create lost time and profits that affect the automotive manufacturer’s bottom line.

To further enhance their processes, manufacturers are turning to resilient lean methodologies. These principles provide better control over several aspects of the production process and supply chain.

What is resilient lean?

When talking about resilient lean, you have to look at the concept as two separate entities working together for the good of the factory. Lean manufacturing means the automaker is looking for ways to lower product waste while maximizing productivity. Resilient manufacturing means the company aims to create a sustainable supply chain that can recover from any type of disruption. These disruptions could be natural disasters, brand negativity, quality issues, or even a loss of customer demand.

Resilient lean principles seek to offer continual improvement to production processes. The concept makes processes cost-efficient and stable while reducing material and energy waste, so that processes can weather unpredictability. This helps manufacturers with their risk management initiatives.

Key areas to apply resilient lean

There are many ways to apply resilient lean to the production process and supply chain. For automakers, some key areas can include production management, scheduling, and materials optimization.

Production management flexibility

One issue in managing production processes and supply chains is relying on a multitude of plants to complete vehicles. Parts are sent from one factory to the next, refined and built out before the final product is shipped to its destination. When adopting resilient lean, an automotive manufacturer seeks to integrate processes in a single plant so it can use available assets more cost-effectively. This approach lowers shipping expenses and fuel costs while increasing productivity.

Automakers can also gain more transparency into their processes and make better decisions about factory locations by using the Internet of Things (IoT) and real-time data connectivity. These technologies analyze user demand so that automakers can determine the best locations to build new plants that are closer to their customers.

In addition, sensors on equipment can analyze every step of the production process to search for equipment issues. Automakers can improve machinery output levels and manufacturing cycles for more resilient operations.

Real-time scheduling

Creating vehicles can become a long and arduous process. Automotive manufacturers rely on time-to-market scheduling to determine the amount of time it will take for a vehicle to be designed, developed, tested, redesigned, completed, and shipped. Any issue that crops up in the time-to-market schedule can slow down the entire process. Unfortunately, customer demand may wane to the point where, once the vehicle is ready, nobody wants to buy it.

Real-time data information technologies can create faster time-to-market schedules for production lines. Computers, robots, and systems can communicate with each other using real-time data to improve workflow. They can also communicate with workers over quality issues to reduce downtimes, as workers can immediately address problems in a timely manner. These technologies help companies reach resilient lean initiatives for optimized operations.

Materials optimization

Another key factor that can benefit from resilient lean methodologies is material waste. Automakers always want to be able to create enough vehicles to satisfy demand without overburdening the current market. Since new models roll off production lines every year, dealers can get stuck with vehicles that cannot be sold, which in turn creates excess inventory at factories since there are no requests for vehicle shipments.

Resilient lean can focus on improving the supply chain by providing accurate and timely analysis regarding customer demand. By using IoT, automakers can gain important customer feedback and move production processes to focus on other vehicle designs at the appropriate time. This method reduces excess materials, saving more working capital for the automaker.

Creating robust supply chains with resilient lean

Seeking improved manufacturing processes and supply chains using resilient lean principles can seem like a daunting task. Yet with the new technologies that are available, automotive manufacturers can create smart factories that lead to faster production lines and lower material waste for a more robust and efficient factory environment. Investing in these digital technologies at the optimal time can transform the automaker’s processes so it can stay competitive and keep its customers satisfied.

Learn how to bring new technologies and services together to power digital transformation by downloading The IoT Imperative for Discrete Manufacturers: Automotive, Aerospace and Defense, High Tech, and Industrial Machinery.


Digitalist Magazine

how do I link system modeler to Mathematica


I have Mathematica 11.0.1, and it will not work with the latest System Modeler. Is there a way I can fix that?


Recent Questions – Mathematica Stack Exchange

Credit Knowledge is Power


This is a guest post from Rourke O’Brien, Assistant Professor of Public Affairs at the University of Wisconsin-Madison.

Does regular consumer access to their FICO® Score influence financial knowledge and behavior?

Professors Abigail Sussman, Tatiana Homonoff and I recently completed a multi-year research study with 400,000 Sallie Mae customers.  Using a randomized control trial research design, we studied whether student loan borrowers who check their credit scores regularly make better financial decisions and manage finances more responsibly. The answer? An emphatic YES.

Kelly Christiano, SVP at Sallie Mae, and I had the opportunity to present the findings at this year’s FICO World. Sallie Mae was the first national private education lender to offer free access to quarterly FICO Scores through the FICO® Score Open Access program.

Our research found that after one year, student loan borrowers who logged on to view their FICO Score had fewer past due accounts and took steps to establish a credit history. These positive behaviors ultimately translated into higher FICO Scores for borrowers.

For students who want to improve financial health, it is important to be aware of the terms when opening new accounts (only open new lines of credit when needed) and to make sure that payments are on time. Moreover, this research suggests that all consumers, not only student loan borrowers, should regularly check their FICO Score to be thoughtful and empowered about how their credit behavior impacts their long-term financial health.

You can also review the full research details in “Does Knowing Your FICO Score Change Financial Behavior? Evidence from a Field Experiment with Student Loan Borrowers.”


FICO

Stairway to heaven




About Krisgo

I’m a mom who has worn many different hats in this life; from scout leader, camp craft teacher, parents group president, colorguard coach, and member of the community band, to stay-at-home mom and full-time worker, I’ve done it all – almost! I still love learning new things, especially creating and cooking. Most of all I love to laugh! Thanks for visiting – come back soon.



Deep Fried Bits

Agile or Waterfall?


As Microsoft Dynamics 365 projects evolve, the overarching question that project managers need to ask themselves is “which framework best fits my project?” In this blog, we discuss some pros and cons of the “Agile” and “Waterfall” frameworks that might assist you in deciding which is the best fit for your project.

Waterfall

This framework is named “Waterfall” because the project life cycle flows downward through a sequence of phases. Typical phases are Feasibility > Plan > Design > Build > Test > PROD > Support.

Pros:

  • Waterfall is a structured process defined in phases, with clear deliverables in each phase.
  • There is an emphasis on documentation such as requirements and design documents.
  • Requirements are defined in advance with little flexibility in allowing change once requirements have been signed off.

Cons:

  • Changes in design must go through a formal change control process that can impede timelines and budget.
  • Oftentimes stakeholders aren’t completely sure of the design they are asking for. Waterfall is rigid in its design and limited in its ability to adapt to the client’s evolving requirements.
  • It is difficult to change work completed in previous stages.

Agile

The agile framework is based on an incremental and iterative approach. Instead of all the planning being up front, the framework allows for changing requirements over time.

Pros:

  • Change is part of the process and not seen as a weakness to designing the product.
  • Since delivery is iterative, issues and problems tend to come to light much quicker.
  • User feedback is crucial throughout the entire lifecycle of the project.

Cons:

  • Documentation can sometimes be neglected. The Agile Manifesto values working software over comprehensive documentation.
  • Planning tends to be less “fixed” since the requirements are continually growing as the product evolves.
  • There is a risk that the final product may look quite different from the original design.

The goal of each framework is to deliver a functional and quality product. At the end of the day, everyone who is contributing to the Dynamics 365 project must collectively decide on the best approach.

Be sure to keep checking our blog for more Dynamics 365 tips!

Happy Dynamics 365’ing!


PowerObjects- Bringing Focus to Dynamics CRM

Checkr and Uber built a service to monitor workers’ background records


Companies that employ on-demand workers rely on checks of criminal records, employment verification, and driving records to spot red flags during the hiring process. The trouble is, those checks are a one-time deal — if a contract worker commits a crime a few weeks into a job, their employer usually remains none the wiser. Case in point? Last year in Massachusetts, more than 8,000 ridesharing drivers failed a state background check for infractions like license suspensions, sexual offenses, and violent crimes.

San Francisco-based Checkr, which counts Instacart, Grubhub, and Uber among its base of more than 10,000 customers, today rolled out a new product that seeks to address this problem. It’s called Continuous Check, and it dynamically spots potentially disqualifying records to ensure that problematic workers are quickly identified and flagged.

“With today’s on-demand workforce, there’s a need to move beyond static background reports to dynamic screenings,” Checkr cofounder and CEO Daniel Yanisse said in a statement. “Through Continuous Check, Checkr is creating a new standard of safety for the [gig economy], and beyond that will provide critical insight into changes in someone’s background that may affect their eligibility to work.”

When Continuous Check identifies that an employee was involved in criminal activity, like a new or pending charge for a DUI, it notifies the worker’s employer, who can investigate further. That’s heaps better than the current arrangement — as many gig economy employers only rerun background checks once a year.

Continuous Check was co-developed with Uber, Yanisse said, and the ridesharing company is the first to deploy it. It’ll roll out to the rest of Checkr’s clients later this year.

“Safety is essential to Uber, and we want to ensure drivers continue to meet our standards on an ongoing basis,” said Uber VP Gus Fuldner. “This new continuous checking technology will strengthen our screening process and improve safety.”

Checkr didn’t immediately respond to questions about the service’s pricing, but historically, Checkr has billed its clients per individual background check, which includes education verification, reference checks, and drug screening.

Checkr announced a $100 million funding round last year led by T. Rowe Price, Accel, and Y Combinator, bringing its total funding to about $150 million. The 180-person company says it runs more than a million background checks a month.

One thing’s for sure: Checkr is well-positioned for growth. The gig economy isn’t slowing down anytime soon — it accounts for almost 4 million workers, according to Intuit, a number that’s expected to grow to 7.7 million by 2020.


Big Data – VentureBeat

A Trade War in the Cloud?

So far, the looming trade war is limited to actions and reactions related to durable things that trade throughout the global economy — cars, steel and aluminum, for instance. Will that remain the battleground? Or should we expect greater contentiousness around services — specifically, Software as a Service, and CRM in particular?

Retaliation for U.S. trade initiatives so far has been aimed at things produced by Trump-leaning industries and states, like motorcycles, soybeans and bourbon whiskey. However, those industries and their workers might not provide the leverage that other nations need to put the genie back in the bottle.

Some workers who make bourbon and motorcycles are vocal about their politics and form part of the Trump base. However, their actions seem limited mostly to reactions, not initiative. They’ll support a Trump policy, but they haven’t been out front advocating.

Looking for Leverage

On the other side of the coin — and I accept that this is a simplification — the tech sector is by definition entrepreneurial, and able and willing to initiate action. So, any attack on subscription providers by foreign nations could, in theory, motivate more reaction designed to stop the trade war.

There’s a lot of potential leverage there, because the U.S. has a dominant share of the Software as a Service economy and the subscription economy generally. While U.S. companies have been stepping up efforts to locate cloud data centers in many countries to support data residency requirements, their services still could look like imported services and thus be subject to tariffing.

The tech sector is also vociferous — both pro and con on numerous political issues. For every Marc Benioff advocating for progressive ideas, there’s a Peter Thiel supporting the administration. However, the progressives might have the numbers, if only because so much of the industry is based in California. So there’s all the more reason for foreign governments looking for leverage against American tariffs to target the tech sector and especially software.

Permanent Alteration

U.S. dominance in SaaS and the cloud generally is a double-edged sword. U.S. companies have dominant positions, but for this reason it would be easy to target the industry with, say, 50 percent tariffs. As with other industries, tariffs could be expected to slow growth and adoption, and possibly cause some vendors to set up production in other countries, as Harley-Davidson is doing.

Once a path is set and investment is made, the shape of any industry could be altered permanently. It will be hard for any business to abandon sunk costs in other lands once a trade truce is agreed to. So we’re possibly looking at a generational or even secular shift in the shape of the cloud and its industries should a trade war infect tech.

A trade war could be a driving force that sends clouds scattering to the corners of the Earth. That could be good for some businesses, but bad for individuals whose jobs would be put in jeopardy. It would be an acceleration of the natural commoditization cycle with one key difference. There wouldn’t be many new job openings for displaced American tech workers whose jobs departed for parts unknown.

Last Word

When the idea of a trade war first surfaced about a year ago, some prescient commentators suggested that there likely would be unforeseen consequences. No one has perfect visibility into the heart or actions of an adversary, regardless of the analytics used.

However, retaliating against what an adversary does well is an old story, and it wouldn’t surprise me to see tariffs on cloud computing services should cooler heads not prevail. Those tariffs would have follow-on effects that we can only guess at right now, but it’s doubtful that they would improve the U.S.’ standing in tech.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.


Denis Pombriant is a well-known CRM industry analyst, strategist, writer and speaker. His new book, You Can’t Buy Customer Loyalty, But You Can Earn It, is now available on Amazon. His 2015 book, Solve for the Customer, is also available there.
Email Denis.


CRM Buyer

Power BI Introduction: Working with Parameters in Power BI Desktop — Part 4

The series so far:

  1. Power BI Introduction: Tour of Power BI — Part 1
  2. Power BI Introduction: Working with Power BI Desktop — Part 2
  3. Power BI Introduction: Working with R Scripts in Power BI Desktop — Part 3
  4. Power BI Introduction: Working with Parameters in Power BI Desktop — Part 4

Power BI Desktop, the downloadable application that supports the Power BI service, lets you define parameters that you can use in various ways when working with datasets. The parameters are easy to create and can be incorporated into the import process or later to refine a dataset or add dynamic elements. For example, you can create parameters that supply connection information to a data source or that provide predefined values for filtering data.

This article demonstrates how to create a set of parameters to use with data imported from a local SQL Server instance. The examples are based on data from the AdventureWorks2017 database, but you can use an earlier version of the database or whatever database you choose, as long as the returned data follows a structure similar to what is used here. Just be sure to update the parameter values accordingly as you work through the article.

Adding Connection-Specific Parameters

Some data source connections in Power BI Desktop—such as SharePoint, Salesforce Objects, and SQL Server—let you use parameters when defining the connection properties. For example, if retrieving data from SQL Server, you can use a parameter for the SQL Server instance and a parameter for the target database.

Because parameters are independent of any datasets, you can create them before adding a dataset or at any time after creating your datasets. However, you must define them and set their initial values in Query Editor. The parameters you create are listed in the Queries pane, where you can view and update their values, as well as reconfigure their settings.

In this article, you will create two connection parameters for retrieving data from a SQL Server database. The first parameter will include a list of SQL Server instances that could host the source data. (Only one instance needs to work in this case. The rest can be for demonstration purposes only.)

To create the parameter, open Query Editor, click the Manage Parameters down arrow on the Home ribbon, and then click New Parameter. In the Parameters dialog box, type SqlSrvInstance in the Name text box, and then type a parameter description in the Description text box.

From the Type drop-down list, select Text, and from the Suggested Values drop-down list, select List of values. When you select the List of values option, a grid appears, where you type in the individual values you want to assign to the parameter, as shown in the following figure. Be sure that at least one of those values is the name of an actual SQL Server instance.


After you type the list of values, select a default value from the Default Value drop-down list and then select the parameter’s current value from the Current Value drop-down list. In the figure above, a local SQL Server instance named SqlSrv17a is used for both the default and current values.

Click OK to close the Parameters dialog box. Query Editor adds the parameter to the Queries pane, with the current value shown in parentheses. When you select the parameter, Query Editor displays the Current Value drop-down list and the Manage Parameter button in the main pane, as shown in the following figure.


You can change the current value at any time by selecting a different one from the Current Value drop-down list, or you can change the parameter’s settings by clicking the Manage Parameter button, which returns you to the Parameters dialog box.

To create the parameter for the target database, repeat the process, but name the parameter Database, as shown in the following figure. Be sure to provide at least one valid database in your list of databases. When you click OK, Query Editor adds the parameter to the Queries pane, just like it did for the SqlSrvInstance parameter.


That’s all there is to creating the connection parameters. You can also configure parameters with different data types, such as decimals or dates, or with different value formats. For example, you can configure the parameter to accept any value or to use values from a list query, a type of query that contains only one column. You can create a list query manually by using an M statement, or you can create a list query based on a regular dataset, although this approach still seems somewhat buggy. For this article, you’ll stick with lists, but know you have other options.
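As a rough illustration of the list-query alternative mentioned above (it is not a step used in this article), a list query can be as simple as a blank query whose M expression returns a single list:

    let
        YearList = {"2011", "2012", "2013", "2014"}
    in
        YearList

The result is a one-column list, which can then be chosen as the source of suggested values when you select Query instead of List of values in the Parameters dialog box.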

Using Parameters to Connect to a Data Source

With your parameters defined, you’re ready to connect to the SQL Server instance and retrieve data from the target database. If you haven’t already done so, apply your changes in Query Editor and close the window.

For this exercise, you will be running a T-SQL query, but before you try to do this, verify that the Require user approval for new native database queries property is disabled. If it is enabled, you will receive an error when trying to run the query. To access the property, click the File menu, point to Options and settings, and then click Options. In the Options dialog box, go to the Security category, clear the property’s checkbox if selected, and click OK.

Then, in the main Power BI Desktop window, go to Data view and click Get Data on the Home ribbon. In the Get Data dialog box, navigate to the Database connections and select SQL Server database, as shown in the following figure.


When you click the Connect button, the SQL Server database dialog box appears. In the Server section at the top of the dialog box, click the down arrow associated with the first option (the one to the left) and then click Parameter. The second option will change from a text box to a drop-down list that contains the two parameters you just created. Select SqlSrvInstance, and then repeat the process in the Database section, only this time, select Database.

Next, click the Advanced options arrow and enter the following T-SQL statement in the SQL statement box (or a comparable statement appropriate for your data):
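The article’s exact query is not reproduced in this extract. Based on the column names (FullName, SalesAmounts), the 2013 filter, and the SUM(h.SubTotal) and ORDER BY FullName ASC fragments referenced later, a sketch against AdventureWorks2017 might look something like the following; adjust the joins and names to suit your own data:

    SELECT CONCAT(p.FirstName, ' ', p.LastName) AS FullName,
      SUM(h.SubTotal) AS SalesAmounts
    FROM Sales.SalesOrderHeader h
      INNER JOIN Sales.SalesPerson sp
        ON h.SalesPersonID = sp.BusinessEntityID
      INNER JOIN Person.Person p
        ON sp.BusinessEntityID = p.BusinessEntityID
    WHERE YEAR(h.OrderDate) = 2013
    GROUP BY p.FirstName, p.LastName
    ORDER BY FullName ASC;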

The SQL Server database dialog box should now look similar to the one shown in the following figure, with the SqlSrvInstance and Database parameters selected.


When you click OK, Power BI Desktop displays a preview window (assuming everything is working as it should), with the name of the two parameters at the top, as shown in the following figure.


When you click the Load button, Power BI Desktop adds the dataset to Data view. Before taking any other steps, rename the dataset to RepSales or something to your liking. You should end up with a dataset that looks similar to the one shown in the following figure.


By defining parameters for the connection properties, you can easily change their values at any time, should you need to retrieve the data from a different SQL Server instance or database. You can also use the same parameters for multiple datasets, saving you the trouble of having to repeat connection information each time you create a dataset based on the same data source.

Later in the article, you’ll learn more about working with existing parameters after they’ve been incorporated into your dataset. But for now, open Query Editor and view the M statement associated with the dataset’s Source step in the Applied Steps section, as shown in the following figure.


If you examine the M statement closely, you’ll see that the SQL Server connection data references the SqlSrvInstance and Database parameters. They’re included as the first two arguments to the Sql.Database function. I’ve copied the statement here and highlighted the two parameters for easier viewing:
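The statement itself is not reproduced in this extract, but the Source step that Query Editor generates for a SQL Server query generally follows this pattern, with the two parameters passed as the first two arguments of Sql.Database (the embedded query is abbreviated here):

    Source = Sql.Database(SqlSrvInstance, Database,
        [Query="SELECT ... SUM(h.SubTotal) AS SalesAmounts ... WHERE YEAR(h.OrderDate) = 2013 ... ORDER BY FullName ASC"])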

An interesting implication of all this is how easily you can reference parameters within your M statements, providing you with a very powerful and flexible tool for customizing the applied steps that make up your dataset.

Adding Parameters to Filter Data

In some cases, you might want to use parameters to filter a dataset, rather than applying a Filtered Rows step, an approach that can be inflexible and difficult to update. For example, you might want to use a parameter in place of the hard-coded 2013 specified in the WHERE clause of the original T-SQL statement:
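In the sketch shown earlier, that is the filter line (your own query may differ):

    WHERE YEAR(h.OrderDate) = 2013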

You can replace the hard-coded value with a parameter that supports a range of values. To do so, create a parameter just like you saw earlier, only this time, name the parameter SalesYear, define a list that contains the years 2011 through 2014, and set the default and current values to 2011, as shown in the following figure.


After you’ve created the parameter, update the M statement associated with the dataset’s Source step by replacing the 2013 value with the following code snippet (including the quotation marks):
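Assuming the T-SQL is embedded as quoted text in the Query option and SalesYear was created as a Text parameter, the replacement snippet splices the parameter into the string (a Decimal Number parameter would need Number.ToText(SalesYear) instead):

    " & SalesYear & "

With that edit, the fragment inside the query string reads WHERE YEAR(h.OrderDate) = " & SalesYear & ".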

Once you’ve updated the code, click the checkmark to the left of the statement to verify that your changes are formatted correctly and that you didn’t somehow break the code. The statement and dataset should now look similar to those shown in the following figure.


To test the new parameter, select SalesYear in the Queries pane and choose a different year from the default 2011. Then re-select the RepSales dataset and verify that the data has been updated.

Adding Parameters to Control Statement Logic

In some cases, you might want to use parameters to control a query’s logic, rather than just filtering data. For example, the SELECT clause in the original T-SQL statement uses the SUM aggregate function to come up with the total sales for each sales rep:
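In the sketch shown earlier, that is the aggregation in the SELECT list:

    SUM(h.SubTotal) AS SalesAmounts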

You can instead insert a parameter in the M statement that allows you to apply a different aggregate function. First, create a parameter named AggType and then define a list that includes one item for each function, using the SUM function as the default and current values, as shown in the following figure.


I’ve included the SubTotal column within the options in part to keep the logic clearer, but also to demonstrate that you can summarize the data based on other columns as well, as long as your dataset supports them. For example, your dataset might also include a DiscountAmounts column, which provides the total sales amounts less any discounts. In such cases, each parameter option can define the specific function and column, including such values as SUM(h.DiscountAmounts) and AVG(h.SubTotal).

After you create the parameter, update the M statement associated with the dataset’s Source step by replacing the hard-coded SUM(h.SubTotal) fragment with the following snippet (including the quotes):
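As with the SalesYear edit, and assuming AggType is a Text parameter holding fragments such as SUM(h.SubTotal) or AVG(h.SubTotal), the replacement snippet is:

    " & AggType & "

so that the SELECT list inside the query string becomes " & AggType & " AS SalesAmounts.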

You can also do something similar with the original ORDER BY clause, using a parameter to provide options for how to order the data. First, create a parameter named ResultsOrder. Then, for each option in the list, specify the column on which to base the sorting (FullName or SalesAmounts) and whether the sorting should be done in ascending or descending order, as shown in the following figure. In this case, the option FullName ASC is used for the default and current values.


Once you’ve created the parameter, update the M statement by replacing the FullName ASC code fragment with the following snippet (including the quotes):
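Again assuming ResultsOrder is a Text parameter with values such as FullName ASC and SalesAmounts DESC, the snippet is:

    " & ResultsOrder & "

which leaves the clause inside the query string as ORDER BY " & ResultsOrder & ".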

Your M statement and dataset should now look similar to the ones shown in the following figure.


Once you understand how to incorporate parameters into your M statements, you have a wide range of options for manipulating data and making your datasets more flexible and accommodating, without having to add a lot of steps to your query. Just be sure that whenever you make any changes, you apply and save them so you know they work and won’t get lost.

Using Parameters to Name Dataset Objects

Another fun thing you can do with parameters is to use them for applying dynamic names to objects. For example, you can change the name of the SalesAmounts column to one that reflects the sales year and type of aggregation being applied. A simple way to do this is to first add a Renamed Columns step to your dataset in Query Editor and then update the associated M statement to include the parameter values.

To add the Renamed Columns step, right-click the SalesAmounts column header, click Rename, type Sales as a temporary column name, and then press Enter. Query Editor adds the Renamed Columns step to the Applied Steps section, along with the following M statement:
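The generated statement is not reproduced in this extract, but Query Editor typically produces a step along these lines (the default step and column names are assumed here):

    #"Renamed Columns" = Table.RenameColumns(Source, {{"SalesAmounts", "Sales"}})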

To incorporate the SalesYear and AggType parameters into the statement, replace Sales with the following code:
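Based on the description in the next paragraph, and assuming SalesYear and AggType are Text parameters, the replacement expression would be roughly:

    "Sales (" & SalesYear & "-" & Text.Range(AggType, 0, 3) & ")"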

The concatenation operator (&) joins the name Sales with the two parameter values, which are separated by a dash and enclosed in parentheses. The Text.Range function retrieves only the first three characters from the AggType parameter.

After you update the statement, be sure to verify your changes. The dataset and M statement should now look similar to the ones shown in the following figure.


Notice that the name of the dataset’s third column now includes the year and function in parentheses, making it easy to see what parameter values have been applied to the data set.

Working with Parameters in Data View

As you saw earlier, to change a parameter value in Query Editor, you need to select the parameter in the Queries pane and update its value accordingly. However, this can be a cumbersome process, especially when you want to update multiple parameter values concurrently.

Fortunately, you can set parameter values directly in Data view. To set the values, click the Edit Queries down arrow on the Home ribbon and then click Edit Parameters. When the Enter Parameters dialog box appears, select the values you want to apply to your datasets and then click OK. The following figure shows the Enter Parameters dialog box and the current parameter settings.


Although the Edit Parameters option name and the Enter Parameters dialog box name are somewhat misleading, the features they represent provide an effective way to update the parameter values and, in turn, the data. Be aware, however, that the changes you make here apply to all datasets using the parameters. If you want to include similar types of parameters in multiple datasets, but you do not want them all to share the same values, you should create parameters specific to each dataset, naming them in such a way that they are easily distinguishable from each other.

Once you know how to set the parameter values, you can try different variations, viewing the dataset each time you apply the new settings. For example, the following figure shows the dataset after applying the parameter settings shown in the previous figure.


Notice that the Sales column includes the year and aggregation type in the column name and that the data is sorted based on the Sales values, in descending order. You can also change the parameter values while in Report view, allowing you to see your changes immediately within your visualizations.

Clearly, the parameter capabilities help to make Power BI Desktop an even more robust and flexible tool, and you can use parameters in a variety of ways, regardless of where the data originates. The more comfortable you become working with parameters, the more effectively you can use them, and the more control you’ll have over your datasets.


SQL – Simple Talk

Parakeet Meet


Let’s get ready to rumble!

“Hold me back bro.”
Image courtesy of https://imgur.com/gallery/EKTfEDb.



Quipster

Still Lost On New Industry 4.0 Keywords? Find Your Answers Here.


The TIBCO Blog