Learn, Use, and Manage Microsoft Flows with these 2018 Updates

The Microsoft Flow team has been hard at work making important improvements this year. Flows are easier to learn, use, and manage than ever before. In this blog we share a few of our favorite updates to Microsoft Teams Integration, Approvals, Sharing Flows, and the Flow Documentation site!

Microsoft Teams Integration

The new Teams integration goes beyond the Teams connector, which already allowed you to create flows that take actions in Teams, for example by posting a message. Now you can interact with Flow from inside Teams itself. The first step is to install the Flow app from the Teams store.

Go to the More Apps menu in Microsoft Teams to access the Flow app. The Flows tab shows your flows and team flows. The Approval tab shows sent and received approvals and the Conversation tab allows you to interact with Flow Bot.

Flow Bot can tell you what it can do, provide you with a list of flows it can run, run a scheduled flow on demand, run a manually triggered flow that does not have inputs, and show the description for a flow.

Approvals

Modern approvals in Flow allow you to quickly set up an automated approval workflow for your data in many systems. Proposed content in SharePoint, Twitter, Visual Studio, and more, can be set up such that actions are automatically taken once the proposed content is approved. Approvers receive emails notifying them of the approval request and users can interact with approvals in a unified Approvals Center in Microsoft Flow.

What’s new in modern approvals? The detail of the approval request can now have rich text, lists, links and tables using Markdown. In addition, approvals you receive can now be reassigned to a different person.
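For illustration, the details of an approval request might use Markdown like this (the content below is a made-up example, not taken from the Flow documentation):

```markdown
**Proposed post:** [2018 Flow Updates](https://example.com/drafts/flow-updates)

Please review the following changes:

1. Updated screenshots for the Teams integration
2. Corrected the pricing table

| Plan   | Audience       |
| ------ | -------------- |
| Free   | Individuals    |
| Plan 1 | Small teams    |
| Plan 2 | Administrators |
```

The rendered request shows the formatted text, list, link, and table to the approver in email and in the Approvals Center.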

The new History tab in the Approvals center shows all requests, both sent and received. Filter by title or direction and sort ascending or descending by date. Click on a row to see the details of an approval request including the comments and the exact time of the request.

The modern approvals connector is now built on the latest version of the Common Data Service for Apps (CDS for Apps). This allows you to build flows that act based on the status of approvals you receive or send. For example, you can build flows to notify you of new approval requests, remind you of pending approval requests, or notify you when a request you sent is approved. Use the template created by the Microsoft Flow team as a starting point. Modify the data operations, variables, and conditional actions to fit your scenario.

Sharing Flows

Until now, you could share flows by adding people as co-owners or (for manual flows only) as run-only users, which makes the flow button available to them in the mobile app. Now, it is possible to add an Office 365 Group (including a Microsoft Team) as a co-owner or (for manual flows) as a run-only user. If a flow has steps that interact with a SharePoint list, the list itself can be added as a co-owner or (for manual flows only) as a run-only user, giving everyone with access to the list access to the flow.

When you add owners to a flow, you will see exactly which connections and permissions are being provided to them.

Since a shared flow always runs with your connections and data, it’s important to evaluate whether you trust the author of the flow and whether you agree with its actions before running it. When running a flow, you can now see details of all the actions and connections in the flow, including details of any SharePoint sites it accesses.

Documentation Landing Page

As you can see, Microsoft Flow is growing and improving all the time. To help you find the documentation most relevant to where you are in your Flow journey, the documentation page now has links to different content for different audiences. Whether you’re a beginner, intermediate user, expert or admin, there is a path for you. This makes it easier to find information you can use right away so you can get started having fun and being productive with Microsoft Flow!

Want to learn even more about Microsoft Flow? Join PowerObjects, this year’s Platinum Sponsor, at the Microsoft Business Applications Summit on July 22-24 in Seattle. Register with code HCL100dc to receive a $100 discount.

Please note: there are three different Flow plans. Microsoft Dynamics 365 applications and Office 365 plans may include Microsoft Flow Free, Microsoft Flow Plan 1, or Microsoft Flow Plan 2. To learn about pricing and features, please visit the Microsoft Flow Plan site.

Happy Dynamics 365’ing!

PowerObjects- Bringing Focus to Dynamics CRM

Health care bots are only as good as the data and doctors they learn from

The number of tech companies pursuing health care seems to have reached an all-time high: Google, Amazon, Apple, and IBM’s Watson all want to change health care using artificial intelligence. IBM has even rebranded its health offering as “Watson Health — Cognitive Healthcare Solutions.” Although technologies from these giants show great promise, the question of whether effective health care AI already exists or whether it is still a dream remains.

As a physician, I believe that in order to understand what is artificially intelligent in health care, you have to first define what it means to be intelligent in health care. Consider the Turing test, the point at which a machine becomes indistinguishable from a human.

Joshua Batson, a writer for Wired magazine, has mused whether there is an alternative measurement to the Turing test, one where the machine doesn’t just seem like a person, but an intelligent person. Think of it this way: If you were to ask a random person about symptoms you experience, they’d likely reply “I have no idea. You should ask your doctor.” A bot supplying that response would certainly be indistinguishable from a human — but we expect a little more than that.

The challenge of health care AI

Health is hard, and that makes AI in health care especially hard. Interpretation, empathy, and knowledge all have unique challenges in health care AI.

To date, interpretation is where much of the technology investment has gone. Whether for touchscreen or voice recognition, natural language processing (NLP) has seen enormous investment, including Amazon Comprehend, IBM’s Natural Language Understanding, and Google Cloud Natural Language. But even though there are plenty of health-specific interpretation challenges, they are really no greater in this particular sector than in other domains.

Similarly, while empathy needs to be particularly appropriate for the emotionally charged field of health care, bots are equally challenged trying to strike just the right tone for retail customer service, legal services, or childcare advice.

That leaves knowledge. The knowledge needed to be a successful conversational bot is where health care diverges greatly from other fields. We can divide that knowledge into two major categories: What do you know about the individual? And what do you know about medicine in general that will be most useful in their individual case?

If a person is a diabetic and has high cholesterol, for example, then we know from existing data that the risks of having a heart attack are higher for that person and that aggressive blood sugar and diet control are effective in significantly lowering that risk. That combines with a general knowledge of medicine which says that multiple randomized controlled trials have found diabetics with uncontrolled blood sugars and high cholesterol to be twice as likely as others to have a cardiac event.

What is good enough?

There are two approaches to creating an algorithm that delivers a customized message. Humans can create it based on their domain knowledge, or computers can derive the algorithm based on patterns observed in data — i.e., machine learning. With a perfect profile and perfect domain knowledge, humans or machines could create the perfect algorithm. Combined with good interpretation and empathy you would have the ideal, artificially intelligent conversation. In other words, you’d have created the perfect doctor.

The problem comes when the profile or domain knowledge is less than perfect (which it always is), and then trying to determine when it is “good enough.”

The answer to “When is that knowledge good enough?” really comes down to the strength of your profile knowledge and the strength of your domain knowledge. While you can make up a shortfall in one with the other, inevitably, you’re left with something very human: a judgment call on when the profile and domain knowledge is sufficient.

Lucky for us, rich and structured health data is more prevalent than ever before, but making that data actionable takes a lot of informatics and computationally intensive processes that few companies are prepared for. As a result, many companies have turned to deriving that information through pattern analysis or machine learning. And where you have key gaps in your knowledge — like environmental data — you can simply ask the patient.

Companies looking for new “conversational AI” are filling these gaps in health care, beyond Alexa and Siri. Conversational AI can take our health care experience from a traditional, episodic one to a more insightful, collaborative, and continuous one. For example, conversational AI can build out consumer profiles from native clinical and consumer data to answer difficult questions very quickly, like “Is this person on heart medication?” or “Does this person have any medications that could complicate their condition?”

Only recently has the technology been able to build this kind of in-depth profile on the fly. It becomes that perfect doctor, knowing not only everything about your health history but also how all of it connects to combinations of characteristics. Now, organizations are beginning to use that profile knowledge to derive engagement points to better characterize some of the “softer” attributes of an individual, like self-esteem, literacy, or other factors that will dictate their level of engagement.

Think about all of the knowledge that medical professionals have derived from centuries of research. In 2016 alone, Research America estimated, the U.S. spent $171.8 billion on medical research. But how do we capture all of that knowledge, and how could we use it in conversational systems? The lack of a standard way to do so is why we’ve developed so many rules-based or expert systems over the years.

It’s also why there’s a lot of new investment in deriving domain knowledge from large data sets. Google’s DeepMind partnership with the U.K.’s National Health Service is a great example. By combining their rich data on diagnoses, outcomes, medications, test results, and other information, Google’s DeepMind can use AI to derive patterns that will help it predict an individual’s outcome. But do we have to wait upon large, prospective data analyses to derive medical knowledge, or can we start with what we know today?

Putting data points to work

Expert-defined and machine-defined knowledge will have to be balanced in the near term. We must start with the structured data and expert consensus that are available, then ask about what we don’t know so that we can derive additional knowledge from observed patterns.

Knowing one particular data point about an individual can make the biggest difference in being able to read their situation. That’s when you’ll start getting questions that may make no sense whatsoever, but will make all the sense in the world to the machine. Imagine a conversation like this:

BOT: I noticed you were in Charlotte last week. By any chance, did you happen to eat at Larry’s Restaurant on 5th Street?

USER: Uh, yes, I did actually.

BOT: Well, that could explain your stomach problems. There has been a Salmonella outbreak reported from that location. I’ve ordered Amoxicillin and it should be to you shortly. Make sure to take it for the full 10 days. The drug Cipro is normally the first line therapy, but it would potentially interact badly with your Glyburide. I’ll check back in daily to see how you’re doing.

But while we wait for the detection of patterns by machines, the knowledge that is already out there should not be overlooked, even if it takes a lot of informatics and computations. I’d like to think the perfect AI doctor is just around the corner. But my guess is that those who take a “good enough” approach today will be the ones who get there first. After all, for so many people who don’t have access to adequate care today, and for all that we’re spending on health care, we don’t yet have a health care system that is “good enough.”

Dr. Phil Marshall is the cofounder and chief product officer at Conversa Health, a conversation platform for the health care sector.

Big Data – VentureBeat

What Did We Learn in 2018’s Cybersecurity Survey?

In 2017 we commissioned independent research company Ovum to carry out a cybersecurity survey among senior executives, with some surprising results. We and Ovum have just completed an even bigger cybersecurity survey, covering 500 senior executives in 11 countries. Organizations from a range of industries were surveyed, with sizes banded from 500 employees up to more than 10,000 employees.

Through 20 questions, we uncovered some interesting statistics and #cybertrends, including:

  • 60% of businesses are expecting levels of investment in cybersecurity to go up in the coming year – for power and utilities companies it was 70%.
  • 76% of organizations have some level of cyber risk insurance – but only half of them consider it comprehensive cover.
  • When asked how cyber ready their organization is, a massive 95% think they are at least average compared to their competitors – a whopping 39% say they’re ‘top performers’.

Ovum discusses key findings from the cybersecurity survey in the white paper, ‘Cybersecurity Survey: Investments, Insurance and Inflated Confidence’. This considers whether IT and senior management are overstating their ability to deal with cyber attacks.

Read the paper now for more on the key findings that:

  • Organizations are overconfident about their cyber-readiness.
  • Cyberthreats are rising and increased spending is the positive industry response.
  • Take up of cyber-risk insurance (CRI) is growing, but comprehensive use and satisfaction rates are low.
  • Pressure to improve cyberthreat protection is increasing from all sources.

I will provide further comment on our findings in future posts.

FICO

Nvidia is training robots to learn from watching humans

Nvidia has developed a method to train robots how to carry out actions by first watching human activity. In initial applications, robots learned to pick up and move colored boxes and a toy car in a lab environment using a Baxter robot.

Learnings from such research will be used to train robots that can work safely alongside people in industrial settings and homes.

“In the manufacturing environment, robots are really good at repeatedly executing the same trajectory over and over again, but they don’t adapt to changes in the environment and they don’t learn their tasks,” Nvidia principal research scientist Stan Birchfield told VentureBeat in an interview. “So to repurpose a robot to execute a new task, you have to bring in an expert to reprogram the robot at a fairly low level, and it’s an expensive operation. What we’re interested in doing is making it easier for a non-expert user to teach a robot a new task by simply showing it what to do.”

The system uses a series of deep neural networks that perform perception, planning, and control, and the networks are trained entirely on synthetic data.

“There’s sort of a paradigm shift happening in the robotics community now,” he said. “We’re at the point now where we can use GPUs to generate essentially a limitless amount of pre-labeled data essentially for free to develop and test algorithms, and this is potentially going to allow us to develop these robotics systems that need to learn how to interact with the world around them in ways that scale better and are safer.”

The findings were shared today at the International Conference on Robotics and Automation (ICRA) taking place this week in Brisbane, Australia.

The new AI system was made with help from the Nvidia robotics research lab. First announced late last year, the lab now has six employees and is preparing to open up offices adjacent to the University of Washington in Seattle this summer.

The research lab will continue to work with the robotics community and in-house at Nvidia to explore the use of synthetic datasets for training AI systems, Nvidia head of robotics research Dieter Fox told VentureBeat.

Such knowledge could be used to strengthen the Isaac SDK, a framework for training robots with simulations first introduced in May 2017.

“Nvidia actually has been working in that domain for quite a while, in the gaming context for instance, where it’s all about setting up 3D virtual environments that are photorealistic and give you some kind of content modeling. What we want to do is also work with all these teams that have all this expertise, but help them expand it in a way so that it becomes better applicable in a robotics setting,” Fox said.

Research like the kind released today will be central to the creation of the next generation of robots, Fox said.

“We’re talking about robots that have to open doors, open drawers, pick up objects, move them around, even physically interacting with people, helping them, for example elderly people in the home,” he said. “These robots need to be able to recognize people, they need to see what a person wants to do, they need to learn from people, learn from demonstration for example, and they also need to be able to anticipate what a person wants to do in order to help them.”

Nvidia joins a growing number of companies, like Google and SRI International, interested in the development of AI systems with a kind of environmental awareness, or, as Google AI chief Jeff Dean put it, more “common sense.”

To read more about the new robotics research, see this Nvidia blog post.


Game publishers should learn how to use data better in 2018

Making a video game is a hard endeavor even in the best of circumstances. Developers have to anticipate an audience’s desires years in advance, determine the best way to monetize their creation, and execute on a vision amid a cascading series of strict deadlines. And with so many new gaming formats becoming part of the gaming landscape, such as mobile, AR/VR, esports, and streaming, developers also have to consider the type of game they want to make.

It will never be an easy job, but it doesn’t have to be quite as hard as it is today. Market research has become a critical part of the design process in recent years, and there are more data streams than ever feeding into that research pool. Too few game makers are taking full advantage of the information that’s available to them, and even those that do generally focus solely on the success of a single title rather than an entire brand.

This is partly because, for years, market research was limited to assembling player focus groups and siloed product data.

Where the problems are

User feedback is problematic because people’s stated opinions don’t always match up to objective facts or their behavior. Take Super Mario Run. When it launched in December 2016, it had over 50 percent negative (one- and two-star) reviews. By that metric it was a failure and you might have predicted poor sales, but it was downloaded over 200 million times! Twelve years ago (before mobile games as we know them), those negative reviews might have tanked the whole project, but here we are, a little more than a year after launch, and Super Mario Run is still in the top games charts.

An additional factor for non-data-driven production is likely that studios and publishers had limited product data to work with, and the data they did have was siloed by teams dedicated to building different parts of a game: level design, texture mapping, character animations, menu interactions, etc.

The success of each of these pieces was frequently measured independently, with only a select few executives (if any) able to see the entire picture. This could lead to gaming experiences where one small issue, say with a menu load time, wound up stacked on top of another small issue where a certain enemy was randomly generated too often. Looked at independently, each feature might be within its design specifications, with development teams giving the green light to move to production. But when these issues are experienced together, players find themselves frustrated with constant, time-consuming interruptions, leading to a poor game experience, poor reviews, and poor sales.

Either way, it was a long and not particularly data-driven process. But have no fear: keep reading for some advice on how you can focus on utilizing the right data to increase success.

Taking a look back

By 2010, mobile gaming was impossible to ignore — and suddenly developers and publishers had a lot of concrete information to work with, including user reviews, engagement stats, download totals, and more. And because it cost substantially less to create mobile games than console titles, developers and publishers could rapidly deploy smaller games in short order. From there, they would monitor the feedback and find areas for improvement, then introduce new downloadable content that was tailored to the playing habits of customers.

You don’t have to look far for examples. Rovio’s Angry Birds made its debut in 2009 and quickly became a smash hit. Over the next four years, the game went through 14 different versions and racked up 3 billion downloads worldwide.

Other titles, like Clash of Clans, Candy Crush, and Temple Run, continued the agile, data-driven approach to game development, parsing user data and releasing new versions weekly, as well as new spin-offs that were largely the same as the original but tweaked enough to entice players to grab them. Many can still be found on the top app charts today.

That new style of game making quickly proved to be a success. Mobile games became the fastest-growing segment of the industry; in 2017, revenue from those titles hit or exceeded $40 billion. It was also the beginning of a more data-driven approach to development, one that would become essential for future, costlier development projects.

Hey, developers, use my data!

Delivery platforms have become invaluable partners for game makers, not only as storefronts but also as data providers. Steam, the leading digital distribution platform for PC games, offers players a number of options beyond its retail roots, including multiplayer gaming, video streaming and social networking services. In these offerings, as well as its storefront, Steam collects massive amounts of data, including Steam page traffic sources, visitor behavior, page visits and purchase information, that developers can analyze to develop the right games for the right audiences. Data from Steam, via APIs to the Steam platform, is sent daily, weekly or monthly to the game publisher. It’s not raw event-level data. Rather, it’s aggregated or summarized information that outlines trends, but doesn’t provide deep insights.
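To make this concrete, here is a minimal Python sketch of working with that kind of aggregated, trend-level data. The data shape is entirely hypothetical (Steam’s actual reports differ); it just shows how a publisher might turn weekly summaries into a week-over-week trend:

```python
# Hypothetical weekly summaries, e.g. parsed from a platform's aggregated report.
weekly_visits = [
    {"week": "2018-W01", "page_visits": 12000, "purchases": 310},
    {"week": "2018-W02", "page_visits": 15000, "purchases": 420},
    {"week": "2018-W03", "page_visits": 13500, "purchases": 400},
]

def week_over_week(rows, key):
    """Return (week, percent change) pairs for consecutive weekly summaries."""
    trend = []
    for prev, cur in zip(rows, rows[1:]):
        change = (cur[key] - prev[key]) / prev[key] * 100
        trend.append((cur["week"], round(change, 1)))
    return trend

print(week_over_week(weekly_visits, "page_visits"))
# [('2018-W02', 25.0), ('2018-W03', -10.0)]
```

Summaries like these can show which direction a store page is heading, but, as noted above, they don’t give the event-level detail needed for deeper insights.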

VR, AR, and esports

Developers are now trying to apply the lessons of online streaming to AR/VR. Game makers aren’t sure what kind of controller works best or what kind of animation resonates with players. So while there’s no previous technology model to follow, there is a data-driven marketing model.

That model works with esports as well. Beyond capturing gameplay data points, competitive gaming has the added opportunity for measuring community engagement, via observer camera angles, team-chat, peer-to-peer betting and trading, and even fantasy teams. All these new interaction types supply data that can be used to improve the game experience and the viewing experience, and, ultimately, boost revenue.

The inclusion of game viewers is something new and potentially huge. Developers now not only have the opportunity to bring players into their brand, but also a large audience that didn’t download or purchase the game. It’s a situation where marketing models are even more critical.

Challenges with data and data aggregation

The challenge developers and publishers face, though, isn’t access to data, it’s access to the complete data picture. Creating a true brand experience doesn’t come from making standalone successful games, it requires getting players of one title to consider trying another they might normally not be aware of (or not normally be interested in playing). And that’s where data aggregation can really help.

Ninety percent of all of the world’s collective data (both inside and outside of the entertainment industry) was created in the last two years, with an additional 2.5 quintillion bytes generated each day. The sources continue to multiply. That means there’s a lot of information for developers to parse, and most of it gets lost. Creating a 360-degree view of each player and understanding their buying behavior across channels can increase long-term loyalty and maximize a company’s ROI.

How does that play out in the real world? Consider this hypothetical: Plants vs. Zombies continues to be a phenomenally popular title on mobile and console, and between those two platforms, EA and PopCap Games have oodles of data to mine. But PvZ has grown into an industry, with plushies, action figures, books, board games, and even slot machines. Those other categories can highlight, for example, which character in the game is popular or trending with players. That data can be used to determine where to focus DLC and expansion efforts and to increase user engagement (which, in turn, makes players more likely to complete in-game purchases).
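As a rough sketch of that kind of cross-channel aggregation (all names and numbers below are invented for illustration, not real PvZ data), combining per-character signals from game telemetry and merchandise sales might look like:

```python
from collections import defaultdict

# Hypothetical per-channel signals: (character, engagement or sales score) pairs.
game_telemetry = [("Peashooter", 900), ("Sunflower", 1400), ("Chomper", 300)]
merch_sales = [("Sunflower", 220), ("Peashooter", 80)]

def aggregate(*channels):
    """Sum each character's signal across all channels and rank them."""
    totals = defaultdict(int)
    for channel in channels:
        for character, score in channel:
            totals[character] += score
    # Strongest combined signal first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(aggregate(game_telemetry, merch_sales))
# [('Sunflower', 1620), ('Peashooter', 980), ('Chomper', 300)]
```

A real pipeline would pull from many more streams and weight them, but the idea is the same: the combined ranking can surface a trending character that no single channel shows clearly.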

The same sort of data can be used with the Star Wars franchise, though as EA learned with the recent release of Star Wars Battlefront 2, it’s critical not to use the data in a way that is obviously meant to boost a title’s bottom line. Knowing the characters players love is one thing; asking them to pay extra to access those characters is a riskier proposition.

Today’s game development studios, big and small, have access to more data about their customers than any creator ever has. But by failing to tie all of that information together, they’re making their jobs more difficult and sacrificing potential profits. Making confident, informed decisions in a timely manner requires a dynamic, scalable way to acquire and aggregate the fast-moving data streams. Companies that do so will find that the key to future success in the video game industry is having a firm grasp on the habits and preferences of their consumers, even if the consumers aren’t aware of those habits themselves. Those who ignore the opportunities that today’s data-rich environment presents do so at the risk of being left behind.

Erik Archer Smith is Marketing Director, ABM at Treasure Data.


Learn to Grow Like Ed “Too Tall” Jones

Posted by Ranga Bodla, Head of Industry Marketing

You probably remember Ed “Too Tall” Jones, and if you don’t, let us remind you:

Jones was one of the most prominent defensive players in the history of football, leading the Dallas Cowboys to three NFC Championships and a Super Bowl XII win. But after five years of playing professional ball, Jones took a “break” to focus on one of his true passions: boxing. Even more incredibly, he then returned to the football field for 10 more successful years with the Cowboys. Jones continually sought to evolve and grow as a competitor, an athlete and a teammate. His contributions to the game of football, his success in the boxing ring and his prodigious return to the field were a true testament to the realization of those goals and have contributed greatly to his current career as an entrepreneur.

Jones shared a few lessons he learned as an athlete during an appearance at a recent Grow Live event, lessons he has used to find success off the field as a businessman. All of them will likely resonate with you, too.

Pursue Your Dreams, Even the Wildest Ones

Jones shared that his love of boxing from a young age inspired him to follow that dream in adulthood. He knew that he wanted to pursue the sport despite his incredibly successful and established football career, so he completed his full contract as well as his option year with the Cowboys before moving on to boxing. For reasons he was unable to share, he only boxed for a year (stay tuned: he says there will be a book at some point to tell us the real story). As he described, because he was able to get the boxing “elephant” off his back, he was able to return with a singular focus on football. Jones emphasized the importance of pursuing your dreams.

Keep Your Doors Open

Building off of the lessons that he shared about pursuing boxing, Jones also advised the audience to never leave a door closed. Despite his success as a football player, the sport wasn’t his first passion. When he left football to pursue boxing, he was sure to maintain his relationships and keep all of his options open because he didn’t know where boxing would take him.  Because of his foresight, Jones was able to return to play with the Cowboys for another decade. When the boxing door closed, he had already ensured that the football door remained open.

The Importance of Your Team

Of course, Jones was passionate about the Cowboys, and his teammates were a big part of his success on the football field. This emphasis on teamwork remained hugely important to him even after he completed his football career. Jones is now an entrepreneur and founder of several companies. He shared with us how he founded those companies with two other individuals and how the three of them, focused as a team, are able to play off each other’s strengths and drive success. He knew when he started that accomplishing anything alone can be incredibly difficult and that building a team you can rely on is vital. He applies the lesson learned in his professional career: If you can’t rely on your team, it all falls apart.

We’re excited to see where Jones’ next chapter takes him. If you want to join us for our next exciting story – come see us at a future Grow Live event.

Posted on Tue, February 27, 2018 by NetSuite

The NetSuite Blog

Learn more about using Big Data Manager – importing data, notebooks and other useful things

In one of the previous posts on this blog (See How Easily You Can Copy Data Between Object Store and HDFS), we discussed some of the functionality enabled by a tool called Big Data Manager, which is based on a distributed (Spark-based) copy utility. Since then, a lot of useful features have been added to Big Data Manager, and to share them with the world, they are now recorded and published on YouTube.

The library consists of a number of videos with the following topics (video library is here):

  • Working with Archives
  • File Imports
  • Working with Remote Data
  • Importing Notebooks from GitHub

For some background, Big Data Manager is a utility included with Big Data Cloud Service, Big Data Cloud at Customer, and soon with Big Data Appliance. Its primary goal is to enable users to quickly accomplish tasks like copying files and publishing data via a notebook interface; in this case, the interface is based on Apache Zeppelin notebooks. The notebooks run on a node within the cluster and have direct access to the local data elements. As some of the videos show, Big Data Manager enables easy file transport between object stores (including Oracle’s and Amazon’s) and HDFS. This transfer is based on ODCP, which leverages Apache Spark in the cluster to enable high-volume, high-performance file transfers. You can see more here: Free new tutorial: Quickly uploading files with Big Data Manager in Big Data Cloud Service
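The videos show the tool itself; as a rough conceptual sketch (not the actual ODCP interface), the divide-and-transfer idea behind a distributed copy can be illustrated in plain Python, with copy_object standing in for a real object-store-to-HDFS transfer:

```python
from concurrent.futures import ThreadPoolExecutor

def copy_object(name):
    # Stand-in for a real transfer of one object (e.g. object store -> HDFS).
    # A tool like ODCP performs this step on Spark executors instead.
    return f"copied {name}"

def parallel_copy(objects, workers=4):
    # Fan the object list out across a pool of workers; ODCP applies the
    # same divide-and-transfer idea at cluster scale for high throughput.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(copy_object, objects))

results = parallel_copy(["part-0000", "part-0001", "part-0002"])
print(results)
```

The object names and worker count here are illustrative; the point is only that splitting a transfer list across parallel workers is what makes high-volume copies fast.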


Oracle Blogs | Oracle The Data Warehouse Insider Blog

3 Critical Things Sales Can Learn From IT

If you’re in sales, there are people in your organization you want to talk to, and others you may go out of your way to avoid. You might be excited to talk to the CMO — or not so excited — based on the leads you recently worked. You might avoid the people from finance who bother you with questions, but you might enjoy conversing with your comp plan administrator — which would make sense, since many of us have a fondness for the people who hand us our checks.

However, there’s one group of people you may never even consider when you’re thinking about discussions inside your company: the CIO and IT team. Yeah, they’re important when your laptop won’t work, but do they really have anything to say that can impact your performance as a salesperson?

Well, yeah. In this day and age, they do.

If your business is on top of technological developments, you may be the beneficiary of an assortment of game-changing technologies. It behooves you to know what’s coming, when, and how well your team is preparing for it.

Imagine that you’re a Formula 1 race driver, and the CIO is the crew chief who’s preparing your car to be faster than any other on the track. Few drivers would fail to engage their crews to find out the latest changes, the timing of those changes, or other things they should know. Yet sales professionals often seem to strap on new technology and blindly hit the road — and sometimes, they crash and burn.

What do you have to talk about with your technology people? Plenty.

The Infrastructure for Your Artificial Intelligence

Artificial intelligence is coming — not to replace you, but to serve as an assistant to help you be the best salesperson you can be.

With AI, you don’t just flip the switch and start getting suggestions. AI must be trained — quite literally — using a very large data set. From there, AI will need a system of storage for the data it generates and the data generated by transactions — in other words, unstructured data about sales performance.

As a salesperson, you should be interested in how this data is stored and managed, since a failure in that regard makes your organization effectively blind — and it turns AI into a liability instead of an asset.

Ask your CIO whether your company is investing in object-based storage. Object storage has the advantage of being limitlessly scalable, and it can integrate new nodes automatically into a single storage namespace.

That means that IT can provision capacity in response to demand rather than trying to predict storage needs and provision capacity that sits unused. The worst-case scenario is that IT fails to provision enough capacity and fails to record a portion of your critical data, which would mean that AI would have to work with a partial picture of the data as it tries to refine its suggestions. That’s a guarantee that AI will never be as intelligent as you need it to be to help close deals.

Suggestion: Get familiar with how well IT is planning to cope with the data tsunami. That is a good indicator of how effective emerging technology will be at elevating your sales game — and it might suggest that it’s time to look for opportunities at organizations with more technology foresight.

The Depth of Your Internet of Things Offering

While the IoT promises big benefits to buyers and sellers, it also requires an enormous amount of trust. Allowing a vendor to install systems that deliver a constant data stream about your business operations is not something to be agreed to lightly — it requires the vendor to have all of its policies nailed down, and to hold all of its employees to an extremely high degree of integrity. If you’re in sales, you have to sell this relationship as part of the deal.

IT needs to think through data policies thoroughly, and it needs to make sure that activities triggered by data in an IoT relationship completely map to the business process. It’s not enough to send replacement parts automatically or schedule maintenance proactively. The behind-the-scenes activities around contracts, invoicing, commissions and quote generation need to be hooked into this system, too.

Suggestion: Talk to IT to see whether the background activities needed to deliver IoT are part of IT’s plans, and get a timeline for the complete integration of business processes into the IoT infrastructure.

You may be pleasantly surprised by a comprehensive plan that lays the groundwork for great customer experiences and lucrative, long-term customer lifecycles, and that is something you can sell with complete confidence. On the other hand, you may be chagrined to find an IoT infrastructure that’s only partially baked, which could result in sales that turn into contractual nightmares and angry buyers in the near future.

Can You Actually Analyze Your Data?

For all the excitement over the last five years about sales analytics, it’s still often a battle to put the right data sets together to find actionable insights. Data from disparate cloud applications may not be easy to use for a number of reasons. There might be API mismatches between cloud providers. Also, different sales support systems may not work together. Finally, the unmanaged deployment of cloud applications for sales and marketing can create silos where data can be hidden.

That means that complete analysis can require a lot of work preparing the data before any numbers can be run. That will take IT time — time the team often doesn’t have — and waiting for IT can result in stale analysis with less impact than it should have had.
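The kind of prep work described above, reconciling mismatched field names across siloed applications before a join, can be sketched in plain Python (the systems, field names, and records below are hypothetical illustrations):

```python
# Hypothetical records from two siloed cloud apps; each system names the
# same field differently, which is exactly the prep work described above.
crm_opps = [
    {"contact_email": "a@example.com", "deal_value": 12000},
    {"contact_email": "b@example.com", "deal_value": 5000},
]
marketing_leads = [
    {"email": "a@example.com", "campaign": "webinar-q1"},
    {"email": "c@example.com", "campaign": "ebook"},
]

def join_on_email(opps, leads):
    # Normalize the mismatched keys into one schema, then join on email.
    by_email = {lead["email"]: lead["campaign"] for lead in leads}
    return [
        {"email": o["contact_email"],
         "deal_value": o["deal_value"],
         "campaign": by_email.get(o["contact_email"])}  # None marks a silo gap
        for o in opps
    ]

print(join_on_email(crm_opps, marketing_leads))
```

Even this toy join shows why unintegrated deployments hurt analysis: the second opportunity has no campaign data at all, so any campaign-effectiveness number computed from it is incomplete.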

Suggestion: Find out from IT how connected your various data sources actually are. That will help you understand what you know and what you could know, and it will put you in a much better place to make suggestions about next steps for data integrations.

Also, learn from IT how new cloud-based applications deployed without any thought for integration affect your analysis. That will help remind you that the IT team should be part of any cloud application decision if you want to maximize data’s impact on sales results.


Chris Bucholtz has been an ECT News Network columnist since 2009. His focus is on CRM, sales and marketing software, and the interface between people and technology. A noted speaker and author, Chris has covered the CRM space for 10 years.
Email Chris.


CRM Buyer


How to Learn from Travelers to Plan the Future of Airports

A few months ago, Airports Council International (ACI) released its list of the world’s 20 busiest airports as of 2016. It’s notable that traffic at these hubs increased by 4.7 percent in 2016, totaling over 1.4 billion passengers, while globally the number of people traveling by air grew 5.6 percent.

These figures show that airports and airlines have huge opportunities sitting in front of them: to offer better services, to become part of each traveler’s journey, and to monetize that journey and increase their revenue. The daily flow of passengers is like ripe cherries on a tree ready to be picked—it would be a total waste to leave them untouched.

Historically, an airport has been regarded as a place where we spend time before a flight. But this attitude will change as soon as airports are considered part of the passenger’s journey. By 2024, most major airports will offer more than just a place to wait to catch your flight; they will offer you a gym class, invite you to an exhibition of masterpieces, or even let you sip a cocktail by the swimming pool while watching airplanes take off.

Most airports have already started offering passengers a largely free service: WiFi. But there is a hidden reason behind it: it allows them to track travelers, learn from their walking paths, and measure how much time they spend in each area. From arrival at the airport entrance, it is possible to track the time passengers spend at the check-in desk, the time it takes them to go through security, how long they spend eating at restaurants, and how long it takes them to reach the departure gate before finally taking off. This gives the airport a considerable amount of data to analyze for insights into passenger behavior.
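The kind of analysis described above, turning raw location pings into per-zone dwell times, can be sketched in plain Python (the event data, zone names, and timestamps below are hypothetical):

```python
from collections import defaultdict

# Hypothetical WiFi tracking events: (passenger_id, zone, timestamp in minutes).
# A passenger's time in a zone ends when their next event begins.
events = [
    ("p1", "check-in", 0), ("p1", "security", 12),
    ("p1", "retail", 35), ("p1", "gate", 80),
]

def dwell_times(events):
    # Aggregate, per zone, the gap between each passenger's consecutive events.
    per_zone = defaultdict(int)
    by_passenger = defaultdict(list)
    for pid, zone, t in events:
        by_passenger[pid].append((t, zone))
    for visits in by_passenger.values():
        visits.sort()
        for (t0, zone), (t1, _) in zip(visits, visits[1:]):
            per_zone[zone] += t1 - t0
    return dict(per_zone)

print(dwell_times(events))  # {'check-in': 12, 'security': 23, 'retail': 45}
```

With real traffic, these per-zone totals are exactly the numbers an airport would watch to spot a slow security line eating into retail dwell time.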

Tracking and optimizing passenger dwell time is fundamental: recent airport studies have found that an extra 10 minutes at the security gate reduces the average passenger spend by a considerable 30 percent.


With the rise of IoT, more products and devices are connected to the internet and to each other, allowing them to exchange information through APIs. To cite one example, digital luggage tags and suitcases will include all flight details and destination information, allowing travelers and holidaymakers to track their bags throughout their journey.

Another limitless opportunity comes from proximity marketing. Knowing at every instant where passengers are going, and combining that with information about their interests, makes it possible to trigger marketing offers. Retailers would benefit from having more customers, and customers would benefit from getting only relevant offers.
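A proximity-marketing trigger of the sort described above can be sketched as a simple matching rule (the offer catalog, zone names, and interests below are hypothetical illustrations, not a real system):

```python
# Hypothetical offer catalog keyed by (zone, interest).
offers = {
    ("retail", "coffee"): "20% off espresso at the concourse cafe",
    ("retail", "fashion"): "Duty-free voucher at the boutique",
}

def trigger_offer(zone, interests):
    # Fire only offers that match both where the passenger is and what
    # they care about; that relevance is what makes this win-win.
    return [offers[(zone, i)] for i in interests if (zone, i) in offers]

print(trigger_offer("retail", ["coffee", "sports"]))
```

A passenger in the retail zone who has declared an interest in coffee gets the cafe offer; a passenger at the gate, or with unrelated interests, gets nothing.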

Being able to track passengers in real time also makes it possible to instantly visualize passenger density in different areas of the airport. For security and operational purposes, this allows airports to measure passenger throughput from one area to another and to take countermeasures when thresholds are exceeded.

With the support of AI and machine-learning algorithms, it is possible to learn and predict in real time when a surge of arriving passengers will cause delays and what needs to be done to speed up operations. Dynamically allocating check-in and security staff, for example, makes it possible to optimize operations.

By gathering all this data from multiple connected devices and analyzing it, it is possible to understand travelers’ needs, plan the future of airports, and offer a greater experience: a win-win situation.

Learn more about what TIBCO is doing in the travel industry.


The TIBCO Blog