Tag Archives: Operationalize

How to Operationalize Your Data Science with Model Ops

May 24, 2020   TIBCO Spotfire

Reading Time: 3 minutes

Just as you wouldn’t train athletes and never have them compete, the same can be said of data science and machine learning (ML): you wouldn’t spend all that time and money creating ML models without putting them into production, would you? You need your models infused into the business so they can help make crucial decisions.

Model Operations, or Model Ops, is the answer. Model Ops is the process of operationalizing data science: getting data science models into production and then managing them. The four main steps in the Model Ops process (build, manage, deploy/integrate, and monitor) form a repeatable cycle that lets you reuse your models as software artifacts. Model Ops (also known as ML Ops) ensures that models continue to deliver value to the organization. It also provides critical insights for managing the potential risks of model-based decision making, even as underlying business and technical conditions change.

Model Ops is a cross-functional, collaborative, continuous process that focuses on managing machine learning models to make them reusable and highly available via a repeatable deployment process. What’s more, Model Ops encompasses various management aspects, such as model versioning, auditing, monitoring, and refreshing, to ensure models are still delivering positive business value as conditions change.

Organizations need to realize the value of data science and machine learning models holistically, rather than treating it simply as a process of developing models. While data science and ML processes focus on building models, Model Ops focuses on operationalizing the entire data science pipeline within a business system. Model Ops requires orchestration and coordination among many different personas within an organization, including data engineers, data scientists, business users, IT operations, and app developers.

In fact, many organizations have a dedicated Model Ops Engineer to facilitate this process. ML models that are people-facing must be unbiased, fair, and explainable; that is what the public demands and regulatory agencies and bodies increasingly require. For such applications, the ML Ops lifecycle must be designed to enable transparency and explainability when it comes to various risks.

The four-step approach to Model Ops 

In order to solve common pain points when it comes to model operationalization — such as the long delay between initiating a data science project and deploying the model — companies are taking a four-step approach: build, manage, deploy/integrate, and monitor. 

Build 

Data scientists use languages like Python and R, as well as commercial applications, to create analytics pipelines. They use innovative ML algorithms, they build predictive models, and they engineer new features that better represent the business problem and boost the predictive power of the model. When building predictive models, data scientists need to consider how the data is structured in production environments. Similarly, for feature engineering, data scientists need to make sure that any new features can be computed quickly enough in real-time production environments.
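
To make this concrete, here is a minimal sketch of a build step using scikit-learn. It keeps feature preprocessing and the model together in one pipeline so the same transformations can be reproduced at scoring time; the churn data set, column names, and artifact file name are invented for illustration and are not from the article.

```python
# Minimal "build" sketch: bundle feature scaling and the model in one pipeline
# so production scoring applies exactly the same preprocessing.
# The data, column names, and file name are illustrative only.
import joblib
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Toy data standing in for a real business data set.
data = pd.DataFrame({
    "monthly_spend": [120, 80, 300, 45, 210, 95, 400, 60],
    "tenure_months": [12, 3, 36, 1, 24, 6, 48, 2],
    "churned":       [0, 1, 0, 1, 0, 1, 0, 1],
})
X = data[["monthly_spend", "tenure_months"]]
y = data["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

pipeline = Pipeline([
    ("scale", StandardScaler()),      # feature engineering travels with the model
    ("model", LogisticRegression()),
])
pipeline.fit(X_train, y_train)
print("holdout accuracy:", pipeline.score(X_test, y_test))

# Serialize the whole pipeline as the artifact handed to later Model Ops steps.
joblib.dump(pipeline, "churn_pipeline.joblib")
```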

Manage 

Models have a life cycle that is best managed from a central repository where their provenance, versioning, approval, testing, deployment, and eventual replacement can be tracked. Besides the metadata associated with model artifacts, the management platform and the repository should track accuracy metrics as well as dependencies between models and data sets.
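
As a hedged illustration of the kind of metadata such a repository tracks, here is a minimal in-memory sketch; the class and field names are invented for this example and do not correspond to any particular model-management platform's API.

```python
# In-memory sketch of model-repository metadata: version, accuracy, data-set
# dependencies, approval status, and creation time. A real platform would
# persist this centrally; all names here are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ModelVersion:
    name: str
    version: int
    accuracy: float                                      # tracked accuracy metric
    data_sets: List[str] = field(default_factory=list)   # data-set dependencies
    approved: bool = False                               # gate before deployment
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class ModelRegistry:
    def __init__(self) -> None:
        self._entries: List[ModelVersion] = []

    def register(self, entry: ModelVersion) -> None:
        self._entries.append(entry)

    def latest_approved(self, name: str) -> Optional[ModelVersion]:
        candidates = [e for e in self._entries if e.name == name and e.approved]
        return max(candidates, key=lambda e: e.version, default=None)

registry = ModelRegistry()
registry.register(ModelVersion("churn_model", 1, accuracy=0.87,
                               data_sets=["customers_2020_q1"], approved=True))
print(registry.latest_approved("churn_model"))
```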

Deploy/Integrate

A data science pipeline is taken from its original development environment and expressed in a form that can be executed independently and integrated into business applications. In the end, you need to be able to deploy the pipeline in a format/language that is appropriate to the target runtime environment.  
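
The article does not prescribe a particular runtime, but as one common illustration, the pipeline serialized in the build sketch above could be wrapped in a small Flask web service so business applications can call it over HTTP; the route, payload fields, and file name are hypothetical choices for this example.

```python
# Illustrative deployment sketch: expose the serialized pipeline as a REST
# scoring service. Route, payload fields, and file name are hypothetical.
import joblib
import pandas as pd
from flask import Flask, jsonify, request

app = Flask(__name__)
pipeline = joblib.load("churn_pipeline.joblib")   # artifact from the build step

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()   # e.g. {"monthly_spend": 120, "tenure_months": 12}
    features = pd.DataFrame([payload])[["monthly_spend", "tenure_months"]]
    return jsonify({"churn": int(pipeline.predict(features)[0])})

if __name__ == "__main__":
    app.run(port=8080)
```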

Monitor

After a model has been deployed, it is monitored for the accuracy of its predictions and its impact on the business. The model needs to remain accurate even as the underlying data changes. Monitoring can rely on input from a human expert or happen automatically, for example through ongoing retraining and champion-challenger loops that a human approves.
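
A champion-challenger check of the kind described above could look like the following sketch; the promotion margin and the print statement standing in for a real approval workflow are simplifications for illustration, not TIBCO's implementation.

```python
# Monitoring sketch: compare the deployed "champion" model against a
# "challenger" on recently labeled data and flag when a swap should be
# queued for human approval. The promotion margin is an arbitrary threshold.
from sklearn.metrics import accuracy_score

def champion_challenger_check(champion, challenger, recent_X, recent_y,
                              promotion_margin=0.02):
    """Return both accuracies and whether the challenger should be promoted."""
    champion_acc = accuracy_score(recent_y, champion.predict(recent_X))
    challenger_acc = accuracy_score(recent_y, challenger.predict(recent_X))
    promote = challenger_acc > champion_acc + promotion_margin
    if promote:
        # In practice, a human approves the swap before it takes effect.
        print(f"Challenger ({challenger_acc:.3f}) beats champion "
              f"({champion_acc:.3f}); queue for approval.")
    return champion_acc, challenger_acc, promote
```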

Realize the value of data science through Model Ops

By utilizing this four-step approach, organizations can realize the value of data science through Model Ops. It ensures that the best model gets embedded into a business system and that the model remains current. Companies that practice this have a big competitive advantage over those that consistently fail to operationalize models and to prioritize action over mere insight. Organizations can move beyond merely building models to truly operationalizing their use of data science.

Learn more about how to operationalize your data science by downloading this ebook. 


The TIBCO Blog


How to operationalize data science and machine learning without IT hurdles

November 13, 2019   BI News and Info

One of the biggest contributors to the Model Impact Epidemic

You may have heard about the Model Impact Epidemic: essentially, a series of obstacles that prevent organizations from properly operationalizing machine learning models. According to a recent Gartner report, two of the top three challenges that prevent data science projects from making it into production are IT-related.

Sometimes, IT departments get overloaded with requests for maintaining and operating different applications and platforms. ML Operations is a brand-new field, which requires some additional skills and a full understanding of the life cycle of a data science project.

If you develop data science projects but they never make it into production, or your processes are not deployed quickly enough, RapidMiner Managed Server is for you.

What is RapidMiner Server?

RapidMiner Server is the orchestration and automation component of the RapidMiner platform that helps overcome many of the organizational challenges that prevent models from being fully deployed so they can deliver real business impact. Data Science is no longer an exploration task that ends with a report written by a data scientist. Today, impact is delivered only once models are deployed and operationalized. And that’s when RapidMiner Server shines: allowing enterprise customers to quickly deploy models in order to create value.

To help you seamlessly integrate your data science workflow into your production environments, as a part of RapidMiner 9.4, we introduced RapidMiner Managed Server, a new service that makes it easy for both small groups and larger companies to benefit from RapidMiner Server, without the need to add new hardware or wait for IT resources.

With RapidMiner Managed Server, we take care of all IT-related tasks (installation, maintenance, security, etc.), removing many of the roadblocks that users find on their path to applying models in production.

Managed Server: We do it for you

RapidMiner Managed Server is a services offering that harnesses our experienced team of ML Operations experts to install, configure, and maintain a RapidMiner environment for you. The environment can include one or multiple Servers (Development, Test and Production). It can grow horizontally whenever needed by having high availability servers or by adding Job Agents for additional execution power. And it can also include any number of Real-time Scoring Agents for Web Services deployments.

We host the environment in the RapidMiner AI Cloud, and it’s tailored to suit your needs. In fact, the first thing we do is analyze your needs so we can size the environment correctly. We make sure your data scientists have room for collaboration, for remote execution of their projects, and, even more importantly, for the deployments that finally provide value for the company. Next, we set up the infrastructure with a focus on network configuration and security. Finally, the RapidMiner platform is deployed and configured.

And then you can start working with it. There is no need to maintain it; it just works. While you do the data science, we do things like:

  • Back up everything and restore whenever needed
  • Manage any incidents that may come up
  • Elastically grow or shrink the environment according to your needs
  • Upgrade to new versions as they become available
  • Monitor the system
  • Secure the system to the highest standards

All this happens transparently while you use the platform and focus on solving your business problems.

RapidMiner Server as a Service

Internally, your users will be accessing RapidMiner Server in the cloud, “as a service”. However, it can be integrated with your network, so you can access your own data sources. And it can be configured in a completely customized way just for you. It’s a dedicated environment you won’t be sharing with anyone else.

Architecture of the environments

The RapidMiner environments are in the cloud, with the flexibility to make changes whenever they are needed. The whole system is “dockerized” and uses state-of-the-art technology that tightens security, ensures availability, and minimizes risk. Our team of security experts makes sure any vulnerability is removed and your data and processes are safe.

The system is also fully monitored, and we periodically share this information with you, so you’ll know if capacity is running low, what time is best to schedule a process, or which processes are using a lot of memory.

RapidMiner Server and Real-time Scoring

RapidMiner Server and Real-time Scoring are both included with RapidMiner Managed Server. They make a good pair: RapidMiner Server provides user control, collaboration, and scheduling, while Real-time Scoring provides easy deployment of projects as Web Services with extremely low latencies during execution. Both can be part of your environment.

Conclusions

With RapidMiner Managed Server, we are offering the expertise and knowledge of our group of expert ML Operations administrators. They’ll take care of installation, configuration, maintenance, availability, re-sizing, security, upgrades—all of the nitty gritty—and you just use the platform and get the value: collaboration during development, remote execution, scheduling, deployment, etc.

This is one of the many ways RapidMiner is working to help our customers move their path to AI beyond the ‘science project phase’ and into the world of operational models. In the world of Enterprise AI, models don’t have an impact until they’re fully deployed. Click here for more information on other new offerings and enhancements we’ve provided to help with this initiative.


RapidMiner


Using Xpress Insight to Operationalize Analytics

February 13, 2019   FICO

Operationalize Analytics: Xpress Insight turns advanced analytic assets into a fully functioning application for business users.

Industry Today recently asked me to contribute an article about getting the maximum value out of analytics. In my experience, the biggest hurdle to maximizing the use of data to create greater business value is being able to operationalize analytics throughout an organization. This means creating a collaborative, scalable, transparent process for getting advanced analytics out of the lab and into the everyday business; that is the only way to truly operationalize analytics and get maximum business value from analytic investments.

I wanted to follow the article up with a specific example of something we offer at FICO, and that’s Xpress Insight. Xpress Insight is a decision support application that enables data scientists to rapidly deploy any advanced analytic or optimization model as a business application.

Across industries, data scientists create powerful models to solve complex business problems. But, as we know all too well, many data science projects are never fully deployed. It’s a huge waste of time and resources when the models don’t reach the intended business users. The real value of highly complex analytic models comes when they’re in the hands of the business users and become an integral part of their work day.

FICO Xpress Insight enables collaboration between the data science team and line-of-business users by taking highly complex analytic or optimization models and turning them into point-and-click applications that help those users make business decisions.

We originally built Xpress Insight to enable optimization applications; now it supports any advanced analytic model, including models built in Python, SPSS, and R. Data scientists trying to create decision support solutions can use Xpress Insight while also leveraging their existing investment in other analytic models and tools.

Companies are sitting on mountains of relevant and useful data that could help inform their business processes and decisions. They need tools like Xpress Insight to unlock the value of that data and place analytics in the hands of business stakeholders in an easy-to-use, explainable framework that empowers the operation.

To check out Xpress Insight go to the FICO Community.


FICO


Want to Operationalize Analytics? Here’s Where to Start

March 11, 2018   FICO

Thoughts From Gartner’s Data and Analytics Summit

This week I was at Gartner’s Data and Analytics Summit in Grapevine, Texas, with my FICO team. This is an event we never miss; it’s attended by thousands of experts, analysts, data scientists, and analytics leaders.

This year’s Summit was dominated by presentations and conversations about data, analytics, explainable artificial intelligence (AI), and machine learning; nearly every discussion came to the same point: we want to use data and technology to gain competitive advantage and deliver real business value, but how? There was a wide variety of opinions. Gartner asserted that “to accomplish this feat, data and analytic leaders must master four key dimensions of scale: diversity, literacy, complexity, and trust.”

In many of my conversations at the Summit, I shared my own view of what it takes to operationalize analytics. By this I mean taking all of the data and insights gleaned from advanced analytics and connecting them to day-to-day operations. Surprisingly, the first two steps really have nothing to do with technology.

What it takes to operationalize analytics

First, companies need to start by putting the decision before the data. With a decision-first strategy, you define the business objective, then determine what data and analytics you need to achieve the goal. If the modeling and data analytics requirements are defined by the business outcome first, data exploration and analytic development are faster and more productive. This helps enterprises home in on meaningful outcomes, shutting out extraneous noise to focus on the insights that address specific objectives.

Then, enterprises need to get data science into the hands of business decision makers. Empower business leaders with the ability to evaluate the complete spectrum of potential opportunities. Experience has shown that when business experts have access to the data, the insights, and the tools to exploit analytics, they can visualize relationships between different variables and actions to quickly identify the preferred outcomes for maximum impact.

Rita Sallam, the conference chair and VP at Gartner, opened the event with a telling statement, “This is a consequential time to be a data and analytics leader.” I couldn’t agree more; leading a digital transformation is no small task. But if you start with the business challenge, then look at the analytics and arm business leaders with access, you will at least be headed in the right direction.

Let’s block ads! (Why?)

FICO
