Tag Archives: Learn

How to Learn from Travelers to Plan the Future of Airports

A few months ago, the Airports Council International (ACI) released its list of the world’s 20 busiest airports as of 2016. Notably, traffic at these hubs increased by 4.7 percent in 2016, totaling over 1.4 billion passengers, while globally the number of people traveling by air grew 5.6 percent.

These figures show that airports and airlines have a huge opportunity sitting in front of them: to offer better services, become part of each traveler’s journey, and in doing so monetize that journey and increase their revenue. The daily flow of passengers is like ripe cherries on a tree ready to be picked; it would be a total waste to leave them untouched.

Historically, an airport has been regarded as a place where we spend time before a flight. But this attitude will change as airports come to be considered part of the passenger’s journey. By 2024, most major airports will offer more than just a place to wait to catch your flight; they will offer you a gym class, invite you to attend an exhibition of masterpieces, or even let you sip a cocktail by the swimming pool while watching airplanes take off.

Most airports have already started offering their passengers a (mostly) free service: WiFi. But there is a hidden reason behind it: WiFi allows them to track travelers, learn from their walking paths, and measure how much time they spend in each area. From the moment a passenger arrives at the airport entrance, it is possible to track the time spent at the check-in desk, the time it takes to get through security, how long is spent eating at restaurants, and how long it takes to reach the departure gate before finally taking off. This gives the airport a considerable amount of data to analyze for insights into passenger behavior.
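To make that concrete, here is a minimal sketch of how per-zone dwell times could be derived from such tracking data. The event format, zone names, and timestamps are all hypothetical, invented for illustration:

```python
from collections import defaultdict

# Hypothetical presence events: (passenger_id, zone, minutes_since_arrival),
# one event each time a tracked device first appears in a new zone.
events = [
    ("p1", "check-in", 0), ("p1", "security", 12), ("p1", "retail", 31),
    ("p1", "gate", 55),
    ("p2", "check-in", 3), ("p2", "security", 10), ("p2", "gate", 48),
]

def dwell_times(events):
    """Minutes each passenger spent in each zone.

    The dwell in a zone is the gap until the next zone is entered;
    the final zone (the gate) gets no dwell, since there is no later event.
    """
    by_passenger = defaultdict(list)
    for pid, zone, t in sorted(events, key=lambda e: (e[0], e[2])):
        by_passenger[pid].append((zone, t))
    dwell = defaultdict(dict)
    for pid, visits in by_passenger.items():
        for (zone, start), (_, end) in zip(visits, visits[1:]):
            dwell[pid][zone] = end - start
    return dict(dwell)
```

Aggregating these per-passenger dwell times across a day is exactly the kind of analysis that surfaces bottlenecks at check-in or security.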

Tracking and optimizing passenger dwell time is fundamental: recent airport studies have found that an extra 10 minutes at the security gate reduces average passenger spend by a considerable 30 percent.


With the rise of the IoT, more products and devices are connected to the internet and to each other, allowing devices to exchange information through APIs. To cite one example, digital luggage tags and suitcases will include all flight details and destination information, allowing travelers and holidaymakers to track their bags throughout their journey.

Another near-limitless opportunity comes from proximity marketing. By knowing at every instant where passengers are heading and combining that with information about their interests, it is possible to trigger targeted marketing offers. Retailers benefit from more customers, and customers benefit from receiving only relevant offers.
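The matching step at the heart of proximity marketing can be sketched in a few lines. The campaign data, zone names, and interest tags below are all invented for illustration:

```python
# Hypothetical campaigns: an offer fires only when a passenger's current
# zone AND recorded interests both match.
OFFERS = [
    {"zone": "duty-free", "interest": "cosmetics", "text": "20% off fragrances"},
    {"zone": "food-court", "interest": "coffee", "text": "Free espresso upgrade"},
]

def relevant_offers(zone, interests):
    """Offers worth pushing to a passenger standing in `zone`."""
    return [o["text"] for o in OFFERS
            if o["zone"] == zone and o["interest"] in interests]
```

Filtering on both location and interest is what keeps the offers relevant rather than spammy.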

Being able to track passengers in real time also makes it possible to instantly visualize passenger density in different areas of the airport. For security and operational purposes, airports can measure passenger throughput from one area to another and take countermeasures when thresholds are exceeded.

With the support of AI and machine learning algorithms, it is possible to learn and predict in real time what could happen when many passengers arrive at the airport at once, what delays might result, and what needs to be done to speed up operations. Dynamically allocating check-in and security staff then makes it possible to optimize operations.
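As a toy illustration of the staffing side, a back-of-the-envelope capacity rule (all rates here are hypothetical, and a real deployment would use a proper queueing or ML model) converts a forecast arrival rate into the number of desks to open:

```python
import math

def desks_needed(arrivals_per_hour,
                 passengers_per_desk_per_hour=60,
                 max_utilization=0.85):
    """Minimum open check-in desks keeping expected utilization below target.

    A simple capacity heuristic, not a full queueing model: each desk is
    assumed to handle `passengers_per_desk_per_hour` at 100% utilization,
    and we leave headroom so queues do not build up.
    """
    capacity_per_desk = passengers_per_desk_per_hour * max_utilization
    return max(1, math.ceil(arrivals_per_hour / capacity_per_desk))
```

Re-running such a rule against a rolling arrival forecast is the essence of dynamic staff allocation.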

By gathering all this data from multiple connected devices and analyzing it, it is possible to understand travelers’ needs, plan the future of airports, and offer a greater experience for a win-win situation.

Learn more about what TIBCO is doing in the travel industry.


The TIBCO Blog

So Much to Learn, So Many Fabulous Speakers at the Houston Energy Forum, September 6‒7


Here’s a preview of some of the keynotes and break-out sessions that will be presented by oil and gas industry leaders at TIBCO’s 15th annual Houston Energy Forum. Learn more and register at https://energyforum.tibco.com/.

Anadarko Petroleum Corporation: Chad Loesel, Staff Drilling Engineer; Eric Keister, Advanced Analytics & Emerging Technologies – Lead Business Strategist; and Dingzhou Cao, PhD, Staff Data Scientist

Real-time analytics and event flow processing with StreamBase

Since remote drilling locations are not easily accessible by engineering and operations staff, an event flow process for analyzing and measuring data feeds was created using TIBCO StreamBase. StreamBase allows Anadarko to receive data, process analytical models, and display results in real time. Learn how Anadarko is leveraging event processing and analytics to drive real-time decision-making.

MongoDB and Ruths.AI: Theo Kambouris, Regional Director, MongoDB and Troy Ruths, CEO, Ruths.AI

Modern data platform for modern semi-structured applications

As companies evolve their analytics maturity, they realize that a flexible, scalable, and reliable data structure and repository is necessary to support the growing footprint of data and its results. Applications are often semi-structured, in which a robust schema sits on top of an unstructured datamart, and are required to integrate with TIBCO technologies. Modern data platforms have risen to meet such needs. In this talk, you will learn why MongoDB is the world’s leading modern data platform, about the vibrant community it offers, and about its exciting future roadmap. We’ll also show MongoDB and TIBCO Spotfire in action together, including a semi-structured application for well production auto-forecasting and competitor analysis.

Monsanto Company: Michelle Lacy, Biotechnology Data Strategy Lead

A paradigm shift in visual analytics and collaboration at Monsanto

Spotfire is widely used across Monsanto and has enabled R&D, finance, and supply chain teams to communicate findings and concepts through powerful analytics, intuitive visualizations, and associated statistical results. Using Spotfire to transform data into information enables business leaders to make well informed and timely decisions on product advancements, new markets, and company direction.

Apache: Travis Osborne, Director of Information Management and Mark Ritch, Manager of Business Intelligence

Power to the people! Establishing a data-driven culture @ Apache Corporation

Apache Corporation began its data-driven transformation in 2016, thoroughly examining how business data is defined, stored, integrated, and delivered in order to align with the company’s recently updated mission. The speakers will describe the DataNOW program, a series of offerings that broaden the use of visualization and data discovery company-wide, as well as the company’s data re-architecture, the innovations delivered, and the revelations that can come from rethinking data.

BP-America L48: Gustavo Carvajal, Sr. Reservoir Engineer

Smart DCA, faster DCA, and intuitive analysis

Fitting production history with the traditional Arps equation or any other decline curve analysis (DCA) is a well-known approach, but it becomes time-consuming when repeated for more than 100 wells, leaving little time to analyze results. A typical manual workflow starts by collecting daily or monthly production data, selecting the DCA method, and adjusting the b and d factors until the history data is fit, all of which requires significant time. We believed it could be improved. Using AUTODCA, TIBCO Spotfire enhanced visualizations, and TIBCO Enterprise Runtime for R (TERR), an entire DCA analysis can be generated in minutes for 100 wells, providing unprecedented evaluation of production and reservoir performance for the field, connecting the dots, and correlating cross-functional information to optimize well and fracture spacing.
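For readers unfamiliar with it, the Arps equation mentioned above has a compact closed form. Here is a minimal sketch (this is the textbook formula, not the AUTODCA implementation, and the parameter values in the test are invented):

```python
from math import exp

def arps_rate(t, qi, di, b):
    """Arps decline-curve production rate at time t.

    qi = initial rate, di = initial decline rate, b = decline exponent.
    Hyperbolic for 0 < b <= 1; the b = 0 limit is exponential decline.
    """
    if b == 0:
        return qi * exp(-di * t)
    return qi / (1.0 + b * di * t) ** (1.0 / b)
```

Manual DCA amounts to tweaking `b` and `di` until this curve tracks the history data, which is exactly the loop that automation removes.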

Chevron: Calvin Caraway, Reservoir Engineer

Using Spotfire as a one stop shop for evaluating performance trends in the unconventional space

Evaluating an unconventional play involves careful management of large datasets and often several software suites to answer the host of questions that arise. By combining the interactive visual capabilities of TIBCO Spotfire with Iron Python and TERR, Chevron has developed an application designed to tackle this challenge, specifically in the Permian Basin. We will cover several app use cases and highlight key features including decline curve analysis, spatial interpolation, and data distribution assessment.

Chevron Supply & Trading: Steven Boyd, Global Advisor – Supply & Transport Raw Materials Processes

Delivering value to Chevron downstream & midstream through better insights

The use of Spotfire in Chevron’s downstream and midstream businesses has grown exponentially in the past few years. Visualization and modeling capabilities have led to additional insights and value. We will share the modeling and analytics approach, highlight a few Spotfire dashboards, and show demonstrations of two of the modeling applications.

ConocoPhillips Company: Neha Reddy, Analyst, Automation & SIV

Increased transactional data mining drives decision-making in ConocoPhillips’ supply chain

Supply Chain Integrated Visibility (SIV) is a new reporting solution helping ConocoPhillips focus less on finding and curating data and more on data-driven actions. SIV uses Spotfire to easily visualize transactional data and empower users to analyze relevant business information, supported by reactive, easy-to-use, intuitive dashboards for varying skill levels. We will demonstrate some of the key visualization features and change-management tactics deployed by SIV.

LINN Energy: Audrie Luna, Business Analyst

Viral Spotfire: Growing and maintaining analytics at LINN Energy

Does your company struggle to gain user adoption of Spotfire? Spotfire is integral to LINN Energy’s self-service analytics platform. We will discuss how we took Spotfire from a rarely used application to a viral BI solution that is now used company-wide, from IT to Operations to Finance. We will also cover the key drivers to consider when building analyses for mass consumption and show examples of some of the core analyses used at LINN.

NRG Energy: Joe Dominic, Director, Business Operations and Lucas Fontenelle, Senior Analyst, Business Operations

Optimizing financial performance through plant maintenance scheduling

NRG focuses on maintaining safe and reliable equipment to support plant operations and financial performance. Scheduling equipment maintenance and outages during periods of low-margin power generation is part of this effort, which starts with an economic evaluation using Spotfire. In addition to project information, historical and forecasted market pricing data serves as a guide for planning. We will demonstrate a few examples of combining lost margin analysis and historical maintenance spending to plan outages with the aim of translating operating performance to profitability.

Southwestern Energy: Paul Melton, HSE Transportation & Logistics Superintendent and Olaf Schroeder, Senior Systems Analyst – Business Intelligence

Squeezing Value+ from in-cab telematics data

As in-cab telematics devices have become commonplace in O&G, useful information needed to manage fleet safety and efficiency is often overlooked or simply missed in the data exchange. Companies frequently look to the in-cab device vendor to provide the best analysis when they themselves have a better view of the big picture. By partnering with a quality provider of in-cab telematics, then bringing that driver performance data into a TIBCO Spotfire environment, SWN created a powerful analytic tool for understanding how length of employment, vehicle age, job classification, employee age, supervisor, and many other factors not known to the telematics vendor all play into the safety of our drivers and the impact on our bottom line.

XTO Energy, Inc: Manny Rosales, Data Analytics – Team Lead

Spotfire data functions and Hadoop: Dynamic GeoSpatial models from .LAS files using Hortonworks, R / TERR, and Spotfire

The Data Integration and Analytics team at XTO Energy, in coordination with Hortonworks and ExxonMobil, developed a Hadoop data source containing XTO .LAS files originally stored in network folders. This combined effort created a single, easy-to-query data source for all wireline measured data in .LAS format. Using existing Hortonworks processes and Hadoop capabilities, XTO combined Spotfire and TERR to dynamically generate queries against the Hadoop dataset, allowing insight generation in seconds instead of hours. Combining .LAS data with XTO’s interpreted tops also led to daily generation of predefined variables for more robust spatial analysis.

Discover more and register to join us at our 15th annual Houston Energy Forum next month! Follow along with the event coverage using #EnergyForum17.


The TIBCO Blog

Learn how to convert website visitors into customers with Dynamics 365


Did you know that less than 12% of visitors to your website take any action? How do you leverage website visit data to engage with visitors and drive conversions? Join us to learn how.

Most visitors to your website have an interest in your products or services, or they wouldn’t be visiting in the first place. If you are not using website visitor data in your marketing and sales efforts, you are missing a significant opportunity.

There are a number of ways to integrate your website with Dynamics 365 to convert visits into customers. During this informative webinar, we will review the options for integrating your website with Dynamics 365. We will also demonstrate how to create a closed-loop process for harvesting website visitor intelligence and applying a sales process in Dynamics 365 to ensure action is taken.

All attendees will receive “Top Productivity Hacks”, a valuable whitepaper with the top hacks used by some of the most successful business owners across the globe.

Click here to register for this free training

This free training is brought to you by Sträva Technology Group. We help businesses evaluate, plan, implement & support Dynamics 365 solutions. More information - www.stravatechgroup.com


CRM Software Blog | Dynamics 365

Learn How to Integrate in Minutes with TIBCO Cloud Integration


In order to stay competitive, your company needs to become a fully connected, integrated organization: in other words, a digital business, where technologies offer innovative ways to connect, collaborate, conduct business, and build bridges between people and devices.

The reality is, this transition is proving harder than expected for many. A lot of companies are finding that their systems can’t connect to today’s cutting edge, all-important cloud services because they are on-premises, behind firewalls, or legacy, and therefore not built with cloud in mind. What to do? How can your business remain agile and competitive in today’s landscape and easily and quickly transition into a digital organization without heavy costs?

Come check out TIBCO Cloud Integration in action and see how easy it is to integrate everything. The TIBCO Cloud Integration live demo series, starting July 11th, will run through the end of November. With this powerful yet easy-to-use integration tool, you don’t have to break up with your legacy applications and systems to transform into a digital company. TIBCO Cloud™ Integration allows users to quickly and easily connect all applications, systems, mobile devices, and partners both inside and outside your company’s network, regardless of how old they are or whether they are cloud-native. And you don’t have to be an IT expert to accomplish this. TIBCO Cloud Integration gives any user—from marketing, to development, to IT, to office hero—the ability to access integration functionality, all from a web browser.

See TIBCO Cloud Integration come to life and how it can:

  • Automate everyday workflows
  • Give you quick time to results
  • Provide a low barrier to entry (both with cost and ease of use)
  • Connect all of your SaaS apps without IT

These webinars are designed to be highly interactive, so bring your questions and use cases. If the first demo doesn’t work for you, sign up for a date that suits your calendar. We recommend you start your TIBCO Cloud Integration trial so you can follow along in real time. Looking forward to seeing you there!


The TIBCO Blog

Google advances AI with ‘one model to learn them all’


Google quietly released an academic paper that could provide a blueprint for the future of machine learning. Called “One Model to Learn Them All,” it lays out a template for how to create a single machine learning model that can address multiple tasks well.

The MultiModel, as the Google researchers call it, was trained on a variety of tasks, including translation, language parsing, speech recognition, image recognition, and object detection. While its results don’t show radical improvements over existing approaches, they illustrate that training a machine learning system on a variety of tasks could help boost its overall performance.

For example, the MultiModel improved its accuracy on machine translation, speech, and parsing tasks when trained on all of the operations it was capable of, compared to when the model was just trained on one operation.

Google’s paper could provide a template for the development of future machine learning systems that are more broadly applicable, and potentially more accurate, than the narrow solutions that populate much of the market today. What’s more, these techniques (or those they spawn) could help reduce the amount of training data needed to create a viable machine learning algorithm.

That’s because the team’s results show that when the MultiModel is trained on all the tasks it’s capable of, its accuracy improves on tasks with less training data. That’s important, since it can be difficult to accumulate a sizable enough set of training data in some domains.

However, Google doesn’t claim to have a master algorithm that can learn everything at once. As its name implies, the MultiModel network includes systems that are tailor-made to address different challenges, along with systems that help direct input to those expert algorithms. This research does show that the approach Google took could be useful for future development of similar systems that address different domains.
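The shared-trunk-plus-expert-heads idea described above can be sketched in a few lines. To be clear, this is a toy illustration of the general concept, not the MultiModel architecture from the paper; the dimensions, task names, and weights are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# One shared representation feeds several task-specific output heads.
# Training any task updates the shared weights, which is how learning
# on one task can help performance on another.
D_IN, D_SHARED = 8, 4
W_shared = rng.normal(size=(D_IN, D_SHARED))
heads = {
    "translation": rng.normal(size=(D_SHARED, 10)),  # 10-way toy output
    "parsing": rng.normal(size=(D_SHARED, 5)),       # 5-way toy output
}

def forward(x, task):
    h = np.tanh(x @ W_shared)   # shared trunk, common to every task
    return h @ heads[task]      # task-specific head

x = rng.normal(size=(D_IN,))
```

In the real system, the shared component is a deep network and the heads are modality-specific encoders/decoders, but the routing pattern is the same.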

It’s also worth noting that there’s plenty more testing to be done. Google’s results haven’t been verified, and it’s hard to know how well this research generalizes to other fields. The Google Brain team has released the MultiModel code as part of the TensorFlow open source project, so other people can experiment with it and find out.

Google also has some clear paths to improvement. The team pointed out that they didn’t spend a lot of time optimizing some of the system’s fixed parameters (known as “hyperparameters” in machine learning speak), and going through more extensive tweaking could help improve accuracy in the future.

Updated 10:45: This story initially said that there was not a timetable for releasing the MultiModel code under an open source license. The code was released last week. This story has been updated to note that and include a link to the repository. 


Big Data – VentureBeat

M/Power Query “Sets Up” DAX, so Learn DAX (and Modeling) First

Let’s say I have ten minutes with an “uninitiated” Excel pro – someone who slings VLOOKUP and Pivots all the time, but has no idea that Power Pivot and Power Query exist.  And in those ten minutes, I’ve got to quickly demonstrate the capabilities of “Modern Excel” or Power BI (where the former means “Excel with Power Pivot and Power Query”, and the latter contains both of those same technologies).


M “Sets Up” DAX.  And then DAX… Spikes Success All Over the Competition.
(If Will Ferrell ever made a volleyball movie, I could imagine him saying something like that)

I’m going to inevitably spend eight minutes on DAX, and then two minutes on Power Query at the end.  I favor that 80/20 split because I think that’s a realistic picture of your future as an Agile BI Pro:  80% DAX, 20% Power Query/M.

But for the uninitiated, Power Query is always the sexier demo.  Always.

Why is that, and how do I “square” that with my 80/20 ratio, and my stubborn insistence to not “lead” with the sexier demo?

It makes total sense to me actually.  Traditional (pre-DAX, pre-M) Excel workflows always have involved a heavy dose of manual data munging.  Squashing a folder full of CSV’s together into a single table, for instance, was just as much “a thing” in 2002 as it is now, fifteen years later.  If you can reduce that to a few clicks, AND then automate all subsequent “runs” of that work down to a single click, wow, that’s mind-blowing to longtime Excel users.

So the magic of Power Query is instantly apparent and tangible to basically any Excel Pro.  They can immediately see how PQ will save them oodles of time and anguish.
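That folder-of-CSVs scenario is easy to picture in code. Here is a minimal Python equivalent of what Power Query automates (file names and columns invented for the demo); the point is that PQ reduces exactly this kind of script, plus all future re-runs, to a few clicks:

```python
import csv
from pathlib import Path
from tempfile import TemporaryDirectory

def combine_csv_folder(folder):
    """Append every CSV in `folder` into one table, keeping one header."""
    rows, header = [], None
    for path in sorted(Path(folder).glob("*.csv")):
        with open(path, newline="") as f:
            reader = csv.reader(f)
            file_header = next(reader)
            if header is None:
                header = file_header
            rows.extend(reader)
    return header, rows

# Demo with two throwaway files in a temp directory
with TemporaryDirectory() as d:
    Path(d, "jan.csv").write_text("region,sales\nEast,100\n")
    Path(d, "feb.csv").write_text("region,sales\nWest,200\n")
    header, rows = combine_csv_folder(d)
```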

The benefits of DAX and relationships, by contrast, are less readily-apparent on first glance.  Portable/re-useable formulas that enable rapid iteration, the answering of “emergent” questions in near real-time, as well as a “subdivide and segment” capability?  Or how about multi-data-table capabilities that provide an integrated and convenient view across many different sources of formerly-siloed data?  These concepts are simply alien to the longtime Excel user, even though they are MONSTERS in terms of their biz value (as well as time-savers and anguish-reducers).  None of the impact “lands” up front because it can’t adequately be contemplated until you’ve started DOING it.

“Help me do what I already do, faster” is FAR easier to absorb than “blow the doors off my existing world, unlocking entirely-new workflows.”


A fair question: Power Query does get used FIRST, prior to DAX and relationships, in any XLSX or PBIX file you create. So, um, why do I say learn it second?

Here’s a slide we use fairly often, in which we illustrate the relationship between Power Query (aka the M engine) and Power Pivot (aka the DAX/Relationships engine):


Power Query/M is a “Preprocessor/Cleaner” for Data Before it’s Fed to DAX

So let’s say that Power Query “feeds” the DAX / Power Pivot engine.

And Power Query can “make” just about anything (in terms of output data shapes) – let’s start out using cake, burgers, croissants, and pizza as a demonstration of its culinary range…


Well then…  what does the DAX engine like to eat?  Shouldn’t we get to know it better first?  What if it has dietary preferences, allergies, or other particular needs?


Sure, fine, but let’s ride it to its conclusion, if for no other reason than we employed a lot of clipart in the making of this post and clipart needs jobs, too.

Here’s the gist:  if DAX is where the transformative power truly lies (which is true), and it has strong preferences about the shapes of data you feed it (also true), we need to understand that before we can use Power Query / M properly.

And some of the things it likes/prefers might NOT be things we’d even think to “cook” (hence the coffee example above).  So, we need to get a feel for what DAX likes to eat.

No, nothing so extreme. What I’m really cautioning about are the dangers of an insidious disease, Queryus Infatuationus, that tends to strike adopters of Modern Excel and Power BI in their early days of usage.

In short, the malady is this:  “look, I’ve got this cool new M hammer!  It’s the answer to my dreams and clearly I should use it for everything!”

I’ve seen M used extensively as an analysis tool, for instance, and that is Not What You Want to Do.  Using M to pre-aggregate and analyze data (instead of DAX), is akin to using SQL for that same purpose (also instead of DAX), and I’ve written about the latter before.  Short version:  you sacrifice all those lovely benefits of portable formulas I mentioned at the beginning of this article (iteration, answering emerging questions, subdivide and segment, etc.)

Bottom line:  in the early going you are going to be learning both (unless all your data lives in databases, and the admins of said databases are willing to make changes for you – in which case you may not need Power Query / M very much).

I’m suggesting, specifically, that you should advance to Intermediate skill in DAX before advancing to Intermediate skill in Power Query / M, and then to Mastery of DAX before Mastery of M.

Another take on this:  if you’re learning to write M scripts from scratch (as opposed to getting by with the toolbar buttons and the occasional google search for a snippet of M), and you haven’t yet conquered CALCULATE, ALL, and FILTER, you’re probably getting too deep into M at the expense of your DAX skill.

I think it’s only honest at this point to say that “getting by with the toolbar buttons and the occasional google search for a snippet of M” is exactly where *I* am at personally with Power Query / M.

On one hand, you may read that and say “oh great, he’s just advising us to be like him so he doesn’t feel bad.”  Even I wondered about that a bit, while writing this article – I was on the lookout for my own confirmation bias because my inner critic doesn’t stay inner – he sits on my shoulder while I write.  And I’m very much aware that at my own company, I rank in the bottom 10 percentile when it comes to M.

But on the other hand, I’m very confident in my assertion that M is not the ideal producer of final results – you don’t want to feed M output directly into visualization layers.  The bar charts at the top of this article paint a proper picture – you don’t get any of the Ten Things Data Can Do For You until you learn how to wield DAX proficiently.  So you absolutely are sacrificing TREMENDOUS capabilities if you leave DAX out of your equation – even if “leaving it out” simply means “not fully leveraging it.”  And if you ARE fully leveraging it, it has a “say” in what you should be doing (and learning!) with M.

Go get ‘em.


PowerPivotPro

Creating and Editing Views: 1.) Learn the Basics. 2.) Conquer the World!


One of our most frequently requested customizations is adding new views to the set of system views that come out-of-the-box with Microsoft Dynamics 365. It seems most of our clients are hesitant to modify these system views or add new ones out of fear that it may be too difficult or complex. However, this could not be further from the truth. These customizations are very easy to complete in Microsoft Dynamics 365, and even easier to learn.

Below are some simple steps to get you started in creating and customizing views for the Account entity. Using the below instructions as a framework, you will be able to come up with some pretty exciting combinations to fulfill your business needs – and therefore, conquer the world!

We will first walk through how to change the default view.

1.) In CRM, click on the module menu drop-down, then click on Settings, then click on Customizations


2.) Click on Customize the System; this will open another window


3.) In the new Customizations window, click the triangles to expand Entities, then expand Account and click on Views.


4.) To change which view is the Default View, click to select the Active Accounts view, then click More Actions, then click Set Default


As you can see, changing the default view is quite easy. It may be more challenging to decide which view you and your users want to be the default view.

We will now walk through the steps to create a new view. These options are available in the same area in Customizations from which we set the default view.

1.) To create a new view, simply click on the New button on the main action bar.

2.) In the New View window, type a name and, optionally, a Description. I chose to name my new view My New Accounts


3.) Click OK.

4.) In the right-hand navigation, click Add Columns


5.) In the Add Columns window, you can select which columns to add to your view by checking the boxes next to them


6.) You can rearrange the columns by highlighting one and clicking the left or right arrows to move it


7.) You can also choose which data appears in the view by clicking on Edit Filter Criteria


8.) Since I want to show Accounts that I own and that were created in the last 30 days, I will update my filter criteria to reflect this:

  • Click Select and grab the Owner field; the next filter part defaults to Equals Current User, so I will leave this.
  • Click Select on the next line and grab the Created On field. Click where it says On in the middle and scroll down to grab Last X Days. Then click next to that where it says Choose Date and type 30.
  • Your finished query should look like this:

[Screenshot: the completed filter criteria]
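Under the hood, Dynamics 365 stores a view’s filter criteria as a FetchXML query. As a rough sketch, the criteria built above correspond to something like the following (attribute names are from the standard Account entity; treat this as illustrative, not something you need to hand-edit):

```xml
<fetch>
  <entity name="account">
    <attribute name="name" />
    <attribute name="createdon" />
    <filter type="and">
      <!-- Owner equals the current user -->
      <condition attribute="ownerid" operator="eq-userid" />
      <!-- Created On within the last 30 days -->
      <condition attribute="createdon" operator="last-x-days" value="30" />
    </filter>
    <order attribute="name" />
  </entity>
</fetch>
```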

9.) You can also choose which column to sort on by clicking Configure Sorting in the right navigation


10.) In this window, you are able to sort based on two columns; I have chosen to sort on Account Name as well as Created On date


11.) Lastly, you can adjust the width of your columns individually by selecting a column and clicking on Change Properties in the right navigation


12.) In this window, you can select from a variety of pixel widths for your columns. The default is 100; however, I wanted to make my date column narrower


The most important thing to remember here is that your users will not see any of these changes until you click Save and Close and then Publish. If your users report that they cannot see the view immediately after you publish, just ask them to refresh their browser and it should appear.

You can use the above items to create an endless variety of views for all of your different record types, and you can modify the views after you have saved them. As you can see, no coding was required. That is the hallmark of Dynamics 365: its ability to seamlessly match the technology to the needs of each business.

For more tips and tricks, check out these similar blogs:

Who Wore it Best: System View vs. Personal View

Assigning System Views Based on Security Roles

Happy CRM’ing!


PowerObjects- Bringing Focus to Dynamics CRM

Learn how to “think like a Freak” with Freakonomics authors at Microsoft Data Insights Summit

With more than 5 million copies sold in 40 countries, Freakonomics is a worldwide phenomenon — and we’re thrilled to announce that authors Steven Levitt and Stephen Dubner will join us for a special guest keynote at Microsoft Data Insights Summit, taking place June 12–13, 2017 in Seattle.

The Microsoft Data Insights Summit is our user conference for business analysts — and the place to be for those who want to create a data-driven culture at their organization. Now in its second year, the event is packed with strong technical content, hands-on workshops, and a great lineup of speakers. Plus, attendees can meet 1:1 with the experts behind Microsoft’s data insights tools and solutions, including Microsoft Power BI, SQL Server BI, Excel, PowerApps, and Flow.

From their bestselling books, to a documentary film, to a podcast boasting 8 million monthly downloads, authors Levitt and Dubner have been creating a data culture all their own, showing the world how to make smarter, savvier decisions with data. With their trademark blend of captivating storytelling and unconventional analysis, the duo will teach you to think more productively, more creatively, and more rationally — in other words, they’ll teach you how to think like a Freak!

Levitt and Dubner will get below the surface of modern business practices, discussing the topics that matter most to today’s businesses: how to create behavior change, incentives that work (and don’t work), and the value of asking unpopular questions. Their keynote will leave the audience energized, prepared to solve problems more effectively, and ready to succeed in fresh, new ways.

If you want to learn how to use data to drive better decisions in your business, you don’t want to miss this keynote — or the rest of the Microsoft Data Insights Summit. Register now to join us at the conference, June 12–13. Hope to see you there!


Microsoft Power BI Blog | Microsoft Power BI


5 Powerful Lessons Content Marketers Can Learn from Journalism


Are you planning to create more content this year? If so, you aren’t alone. In fact, 66 percent of B2B marketers plan to produce more content in the coming months. They too understand that creating valuable content for their target audience is a surefire way to beef up results and revenue. Content marketing costs 62 percent less than traditional marketing and generates about three times as many leads.

A larger amount of content, however, also means something else … more noise. So how can your content stand out? The answer is simple: Think like a journalist. Here are five tips to swipe from journalism to strengthen your content marketing efforts this year.

1. Get to the Point (aka Don’t Bury the Lead)

Aspiring journalists are taught the "inverted pyramid." The structure is straightforward and very effective: the most important details appear at the top of the story, either in the lead paragraph or in a close-to-the-top "nut graph" that gives the point of the story and explains its newsworthiness in a nutshell. Less-essential details follow, in order of importance, in the subsequent paragraphs.

When crafting the lead, start with the “five W’s” of journalism, better known as “the who, what, when, where, and why.”

The body of a news story also includes things like quotes from sources (more on this a little later), statistics, background information, and other critical details, with the least important information at the end.

This structure is old, but it remains relevant today, especially for content marketing, for one reason: people skim when they read online. In fact, readers don't consume 80 percent of the words you write!

Experiment with this journalism-inspired structure on your next content marketing piece and monitor the results. You may be surprised by higher engagement and increased results.

2. Create Greater Depth: Use Quotes

Journalists use quotes for a variety of reasons, such as crafting a stronger narrative, adding credibility, and bringing a story to life ― and you can too. Each piece of content that you create has an underlying audience pain point, something that connects you to the audience. For example, let’s say you decide to generate a blog post series on sales and marketing alignment, a known pain point for your target audience and a problem your software offering resolves.

Adding quotes to the content marketing piece will not only illustrate that pain point, but also make your piece more reliable. Here are a few tips for using quotes strategically in your content marketing.

  • Select experts. Seek people in the industry who are considered authorities on a particular subject. For example, perhaps choose a technology security expert when you’re strategically creating content that supports a software security product.
  • Seek peers. Who is reading your content? Maybe it’s the CIOs at small-to-medium-sized organizations. If so, seek out these types of people to discuss common pain points and quote them in your content marketing.
  • Partner with influencers. Who is an established influencer in your space? For example, perhaps you’re strategically creating content about the Internet of Things (IoT). If so, seek IoT influencers through the social media channels in which your audience engages, such as Twitter and LinkedIn.

Once you speak with these experts, quote them in your content marketing, just as a journalist would, to add life to your story.

3. Leverage the Power of Storytelling

Some of the greatest journalists are also skilled storytellers. Some media outlets are even creating new visual formats specifically for storytelling, such as the New York Times story "Snow Fall," which uses text and digital visuals to tell the story of an avalanche at Tunnel Creek in Stevens Pass, Washington. Within six days, the story had been viewed more than 3.5 million times.

The Washington Post put together an interactive visual journey through Lesbos in 2016. The engaging documentary uses video, text, and even audience participation (in the form of yes/no questions) to convey the experiences of refugees on the Greek island.

These two examples can inspire marketers to create their own interactive content marketing that leverages storytelling. Here are a few tips for creating more powerful stories:

  • Use emotion. The best way to draw a reader into a tale is to build an emotional connection. Find a narrative in history that relates to your content marketing message. For example, if you’re telling a story about overcoming failure, you might highlight the anecdote of how Walt Disney’s newspaper editor told the aspiring cartoonist that he simply was not creative enough. The Kansas City Star editor said that Disney “lacked imagination and had no good ideas.” When you find a great story that ties into your message, readers will instantly feel more engaged.
  • Create challenge and conflict. The best stories present a problem, obstacle, or struggle that needs to be resolved or overcome. The conflict can be simple, such as a CIO struggling with an increased risk of security breach ― and identifying his or her enterprise’s weaknesses.
  • Shock and awe. A recent Entrepreneur article highlights the power of “shock and awe” in human thinking patterns. It says that “We process the vast exposure to information and try to spit out a logical understanding. A break in that linear pattern is like a splash of icy water on your face. That’s why movies like ‘The Sixth Sense,’ ‘Fight Club,’ and ‘Romeo & Juliet’ are capturing. The twist endings created a mental pattern break.” Look for opportunities to provide something unexpected or surprising in your content marketing.
  • Drop early hints. Leave breadcrumbs that show what is coming in the near future to keep your readers engaged.
  • Create stories that are sharable. People share content for a variety of reasons, but mostly they’re trying to be helpful to peers and demonstrate authority in their niche. If your content marketing aligns with these purposes, it will be easier to share.

Look at your content and start asking, “Where is the story?” and “Would I want to read this?” This will help you create content that is more engaging and valuable to your target audience.

4. Leverage Big-Picture Trends

Want to create a blog post or a piece of content that has a greater chance of going viral? Piggyback hot trends. This strategy, which is often used by journalists, takes trends that are making the news and ties an angle to a specific topic or industry.

Find these hot topics by watching the headlines and by monitoring trade publications or industry-related media sources that your target audience follows. Look for relevant statistics, topic coverage by multiple publications, and online content that is setting a tone of comments and engagement. Then jump on those trends and tie them into your content.

5. Fact-Check Like a Pro

Good journalists are meticulous about fact-checking, mostly because they have a responsibility to their readers, and also because someone else will check whether they've done their due diligence in reporting. Generations of reporters have been trained with the popular saying "If your mother says she loves you, check it out."

Fact-checking is even more important today because incorrect statements aren't simply corrected in the next edition of the news; instead, they spread on social media through tweets and shares. Here are a few fact-checking tips for marketers:

  • Find the best sources possible. For example, when reading the New York Times, you may find a great fact that fits nicely into the content that you’re publishing. Instead of citing the publication, go to the original study or source, verify it, and then include the information.
  • Use two reliable sources when possible. Some media outlets have a rule that all facts should be confirmed by two sources.
  • Determine whether the source is reliable. Ask yourself some basic questions about each source, such as “Is it reliable, trustworthy, and unbiased?” If not, look for another source with similar facts.
  • Confirm with subject matter experts. Did you find a really great fact but need another source to back it up or add additional credibility? If so, interview a subject matter expert and get his or her opinion to create additional credibility.

One more tip: when doing research, it's tempting to quote people who say great things that support your piece. Instead, try getting in touch with the person and obtaining a fresh quote that relates specifically to your content marketing piece. The content will instantly be much stronger.

Audience First … Always

Last, great journalists always put the audience first. They are providing a service; they are educating, informing, and sharing information. Content marketers can do exactly the same thing while still getting excellent results and earning more trust from their consumers.

Understand the audience’s pain points, deliver content that hits these problems ― and always look for angles that other marketers are missing. When you do this, you’ll quickly cut through the noise and become an invaluable resource for your target audience.

What do you think journalism can teach content marketing? Please share your thoughts below!


Act-On Blog