Tag Archives: edge

Cloud, Edge, or Local? Dynamics 365 Finance and Operations Deployment Options


In today’s blog on Dynamics 365 for Finance and Operations, we are going to touch on different deployment scenarios that are available for customers. When choosing a Microsoft Dynamics 365 for Finance and Operations deployment, there are three options: Cloud, Cloud and Edge (hybrid), or Local Business Data (on-premises).

The option you choose will depend on your business need. One great feature is that you have the option to change up your deployment down the line. Why would you do that? Well, maybe you recently made a large infrastructure investment but still want to implement Dynamics 365. You could start with a Local implementation and then switch over to the Cloud at the end of that hardware’s life cycle. How great is that?

Now let’s move on to learning more about each of the deployment options (we’ll use Microsoft’s descriptions of each type and explain what they mean).


Cloud

This scenario offers a fully Microsoft-managed cloud ERP service that includes high availability (H/A), disaster recovery (D/R), sandbox environments, and application life cycle management, combined with cloud-based systems of intelligence, infrastructure, compute, and database services in a single offering.

What does this mean?

It means that your Dynamics 365 deployment lives in Microsoft’s Cloud and is fully managed by them, so Microsoft is the data trustee in this scenario. This covers your high availability and disaster recovery using geo-redundant data centers. This option has the tightest integration with Azure stack offerings like BI, Machine Learning, Logic Apps, etc. Microsoft is taking a cloud-first approach, so some new functionality around infrastructure tie-ins (SQL) will likely appear here first.

Why would you consider this deployment type?

There are some great benefits with this option. You get “always on” infrastructure without having to staff it, and you get disaster recovery (D/R) and high availability (H/A) without additional hardware investments. One of the biggest advantages of this option is scalability: you can scale up when needed without an additional hardware investment, because you are paying for user licensing rather than computing power or “hardware.” Need to add a new division and require additional AOS resources or SQL processing power? You can scale up with a service request.

Cloud and Edge

A primarily cloud-based scenario that combines the power of a Microsoft managed Cloud service with plans to enable organizations to run their business processes from application servers at the edges. This means that transactions are supported by local application services and business data is stored locally.

What does this mean?

It means that this is a hybrid of Cloud and On-Premises deployments. Transactions are entered via local applications and the data is stored locally in the customer’s data center. There is also a central Cloud node which provides a single view of the business data and provides failover to the Cloud – D/R and H/A. Since it is still using the Cloud, you can still take advantage of BI, Machine Learning, etc. without a lot of manual configuration.

Why would you consider this deployment type?

Some organizations, in industries like retail or manufacturing, may require things to run locally for business continuity. For example, a retailer must be able to run store POS operations even if internet connectivity goes down for a few seconds or minutes. This way you can capture transactions locally and sync back up when you’re back online, without having to worry about redundant internet connectivity at the store level. In this scenario, both the customer and Microsoft are considered data trustees.
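The store-and-forward behavior described above can be sketched in a few lines of Python. This is only an illustration of the pattern; the class and field names are invented and are not part of Dynamics 365.

```python
from collections import deque

class OfflineCapablePOS:
    """Captures transactions locally and syncs them to the cloud
    when connectivity returns (store-and-forward)."""

    def __init__(self):
        self.local_queue = deque()   # transactions awaiting upload
        self.cloud = []              # stand-in for the central cloud node
        self.online = True

    def record_sale(self, txn):
        if self.online:
            self.cloud.append(txn)          # normal path: write through
        else:
            self.local_queue.append(txn)    # offline: buffer locally

    def connectivity_restored(self):
        self.online = True
        while self.local_queue:             # drain the buffer in order
            self.cloud.append(self.local_queue.popleft())

pos = OfflineCapablePOS()
pos.record_sale({"sku": "A1", "amount": 9.99})
pos.online = False                          # internet drops at the store
pos.record_sale({"sku": "B2", "amount": 4.50})
pos.connectivity_restored()
print(len(pos.cloud))  # 2 -- no transactions lost
```

The key design point is that the register never blocks on the network: sales are always accepted, and ordering is preserved when the buffer drains.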

Local Business Data

A scenario that will allow customers to deploy a Dynamics 365 for Finance and Operations environment into their or their partner’s data center and run operations without replicating business data to the Cloud.

What does this mean?

This one is fairly self-explanatory. The customer deploys all the Dynamics 365 components in their own data center. None of the applications or data reside in the Cloud. The customer is the data trustee, responsible for H/A and D/R, and for maintaining all the servers in the data center. Since there isn’t integration to the Cloud, options for things like integrated BI and Machine Learning are not available or require some additional configuration. Configuring all applications does require some heavy lifting from your IT staff and there are some serious system requirements.

Why would you consider this deployment type?

Some organizations are just not ready or comfortable with their mission critical data and apps residing in the Cloud. Or maybe you just spent a ton of money on recent hardware improvements and want to take advantage of that. Some industry mandates may require full access and control to the infrastructure or data. Maybe internet connectivity is an issue and there aren’t many options for redundant connections.

Here is a matrix that outlines the options and provides some details on each. Please note that the image is now a bit outdated and all the options are currently available.

[Image: matrix comparing the Cloud, Cloud and Edge, and Local Business Data deployment options]

Clearly, there is a lot to consider when making your decision on a Dynamics 365 for Finance and Operations deployment type. The good news is that there are options, and you can choose to go one route in the beginning and then change directions downstream if you want!

Learn more about Dynamics 365 for Finance and Operations in our on-demand webinar series! Learn how PowerObjects can help you on your Dynamics 365 journey here.

Happy Dynamics 365’ing!


PowerObjects- Bringing Focus to Dynamics CRM

FogHorn raises $30 million to provide IoT edge computing analytics


FogHorn, which provides data analytics software for industrial and commercial Internet of Things (IoT) applications, announced today that it has secured $30 million in funding. Intel Capital and Saudi Aramco Energy Ventures co-led this round, with new investor Honeywell Ventures joining in. All previous investors also participated, including March Capital, GE, Dell, Bosch, Yokogawa, Darling Ventures, and The Hive.

“In the industrial application of IoT, such as manufacturing facilities, oil and gas production sites, and transportation systems, there are often hundreds or thousands of physical and video/audio sensors continuously producing a massive amount of high velocity data,” wrote FogHorn CEO David King, in an email to VentureBeat. “This data is being collected at the network ‘edge’, what Cisco coined the ‘fog’ layer several years ago.”

Edge computing is a method of optimizing cloud computing systems by performing data processing at the edge of the network, near the source of the data. According to King, industrial operators face several challenges when collecting and processing data, including a high volume of data-collecting sources, high costs in transporting this data to the cloud, and limits to real-time insights.

While latency may be fine when you are conversing with Amazon’s Alexa, having a delayed response to a gas leak could be extremely dangerous.

FogHorn’s Lightning platform has been purpose-built to run in very small footprint (256MB or smaller) edge computing systems. “The reason this is important is that the vast majority of data streaming from IoT sensors is useless within a very short period of time,” wrote King. “The information that is valuable — the anomalies and hard-to-detect patterns — need to be acted upon while operators can take corrective action.”
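The core idea, filtering at the edge so only the valuable anomalies travel upstream, can be sketched in a few lines of Python. This is a toy illustration of the general technique, not FogHorn product code; the thresholds and sensor names are invented.

```python
def edge_filter(readings, low=10.0, high=80.0):
    """Runs on the edge device: keep only out-of-range readings
    and discard the rest, so only anomalies are sent to the cloud."""
    return [r for r in readings if not (low <= r["value"] <= high)]

# 1,000 routine sensor readings plus two anomalies
stream = [{"sensor": "pump-1", "value": 42.0}] * 1000
stream += [{"sensor": "pump-1", "value": 97.3},
           {"sensor": "pump-1", "value": 3.1}]

anomalies = edge_filter(stream)
print(len(anomalies))  # 2 -- the other 1,000 readings never leave the edge
```

Even this trivial rule cuts upstream traffic by three orders of magnitude in the example, which is why edge-side filtering matters when transport costs and latency are the bottleneck.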

FogHorn licenses its software on a subscription basis to dozens of customers, according to King. The chief executive does not see any direct competitors focusing solely on tapping into streaming edge data for analytics, machine learning, and AI. “Amazon Greengrass and Microsoft Azure Edge are now targeting the edge with reduced footprint versions of their heavy cloud software stacks, but both still send most data to the cloud for advanced data science functionality,” he added.

The investment from Saudi Aramco Energy Ventures should secure FogHorn’s foothold in Saudi Arabia, which is one of the world’s biggest oil producers.

“Given the heavy presence of oil and gas, we expect it to be a large market in the future,” wrote King. “By partnering with Saudi Aramco Energy Ventures, we’re just beginning our reach into this market.”

To date, FogHorn has raised a total of $47.5 million. The Mountain View, California-based startup will use the fresh injection of capital to hire more engineers and increase sales and marketing efforts.

Founded in 2014 as part of The Hive incubator, FogHorn currently has more than 40 employees.



Big Data – VentureBeat

Edge Computing And The New Decentralization: The Rhyming Of IT History


Responding to rapid changes

A digital core is an IT architecture that offers stability and long-term reliability for core enterprise processes, yet also provides the flexibility to adapt quickly to new opportunities, challenges, and regulations. This is the concept of “bimodal IT”: a stable and reliable IT architecture that handles end-to-end processes, provides simulation and analytical capabilities, simplifies order commitments, empowers teams with individualized customer insight, supports business models, and connects to business networks. This solid foundation gives you a single source of truth, which in turn enables flexibility for innovation, such as new business models, new regulations, and business events such as mergers or acquisitions.

Simulation and analytics

The ability to respond quickly is an essential part of managing a high-tech company. To do this, simulation, prediction, and analytical capabilities are an important component. Data is critical for insight to make decisions. This insight must be at a granular level, so that decision-makers have the detail they need to understand trends, opportunities, and risks, and quickly carry out what-if analysis using predictive algorithms.

Keeping data under control

Many companies have challenges in effectively leveraging all the data they collect. They need the computing capability to carry out complex algorithms on large data sets to support timely, real-time analysis. Everyone in the company must have access to the data they need, whenever and wherever they need it. This also extends to the ecosystem: enabling suppliers to stay up to date with a company’s orders, and allowing salespeople to see customer history for credit risk and stock information and provide accurate delivery schedules.

With this sort of data and analysis, companies can then reevaluate the services or products that are provided to customers. And they can simulate the impact of providing bundles and services, which can also enhance the customer experience.

For more insight on how advanced technology is transforming business strategy, see IoT Today: Be A Disruptor Or Be Disrupted—Look At Your Business A Whole New Way.


Digitalist Magazine

Kleptocracy American-Style's flagship ready to sail … over the edge



That first meeting between Robert Mueller’s folks and Reince Priebus should be interesting… more on The Real Animal Husbands of the Trump White House.

Because it’s Das Love Boot… and Reince will not be the next ambassador to Greece, but there does seem to be a junta coming into focus at the WH. 









Oliver North NSC uniform wearer

Trump has a thing for generals. He routinely refers to them as “my generals” and brags about them looking like they’re from “central casting.” Kelly was the third general the president named to a senior position after Secretary of Defense James Mattis—who had served so recently that he required a special waiver from Congress in order to take the job—and National Security Adviser Michael Flynn, who Trump reportedly asked to wear his uniform to work despite the fact Flynn had retired from the service. (The retired Kelly also does not wear his uniform to the workplace.) When Flynn was forced to resign over contacts with Russia, Trump replaced him with another general, H.R. McMaster, who has the advantage of still being in the military, meaning he can wear his uniform around the office….

Despite Trump’s professions of love and admiration for his generals and his claim that he’s giving the military “total authorization,” the president doesn’t always seem to care about what they have to say. He ignored Mattis’ plan for fighting ISIS for months. He’s widely known to be frustrated with McMaster, particularly the national security adviser’s desire to commit more troops to Afghanistan. And despite his claim that he’d decided to ban transgender people from the military after consultation with “my generals,” the Pentagon had no idea the announcement was coming. Mattis was apparently only informed the day before and was on vacation. Joint Chiefs Chairman Gen. Joseph Dunford says he’s not going to implement the ban until he receives an official order.

Before Kelly moves into his new role, then, particularly given that his new brief will involve more than just national security issues, he should keep in mind that Trump is often less interested in heeding his generals’ wisdom than he is in using them as props.





Connected Intelligence in the IoT Edge

Flogo + IOTA—The little IoT app engines that could!

We are thrilled to have announced Flogo Edge Applications and TIBCO® IOT App Engine (IOTA™) last week at TIBCO NOW in Berlin. On this occasion, we are happy to feature a 3-part series of blogs that will explain our strategy and vision behind Flogo Edge and IOTA:

The three articles in this series will be:

  • Strategizing for IoT edge – Rajeev Kozhikkatuthodi
  • Introducing Flogo edge apps – Matt Ellis
  • Industrializing the IoT edge with TIBCO IOTA – Rahul Kamdar

There are few things in today’s technology landscape that promise to be more transformational than the emergence of an estimated 20 billion IoT edge devices by 2020. Even for an industry, as the joke goes, that has incorrectly predicted 10 out of the 3 “game-changers” in the past couple of decades, the implications of this brave new era of edge compute are difficult to ignore.

Last week at TIBCO NOW in Berlin, we unveiled TIBCO IoT App Engine (IOTA), a commercial, industrial IoT offering for your IoT edge applications. Powered by the open source Project Flogo, IOTA sits at the intersection of the Internet of Things (IoT) and edge application development. It is an important milestone in the evolution of not just TIBCO’s mission to Interconnect Everything, but also our overarching Connected Intelligence vision and strategy. In this post, I will provide some context around the emergence of edge computing and the importance of edge applications in particular, and outline TIBCO’s vision for IoT edge applications.

Emergence of edge computing and edge apps

A lot of insightful analyses have been published on this topic, including Peter Levine, General Partner at a16z, on how cloud computing is coming to an end. He argues that today’s cloud computing era, powered by largely centralized compute infrastructures, is going to be (yet again!) replaced by a distributed computing era. A key area of interest for my team is the applications that physically run on these edge devices and interact with cloud apps and services. At the end of the day, despite appearances, those 20 billion things are nothing but programmable, resource-constrained computers that also happen to be things. This opens up unprecedented opportunities and challenges around architecting, developing, and operating these billions of edge apps. These edge apps are not going to look like traditional cloud or on-premise apps. Analysts like Janakiram MSV have written about how the edge is not just ideal for IoT solutions, but also extends to an entire new class of business applications, and we wholeheartedly agree.

Today’s cloud-centric IoT models won’t hold

One of the well-understood impacts of the edge as it emerges is that it will unleash a data tsunami. The evidence is already out there: a connected car can produce up to 5 terabytes of data hourly, and an oil and gas drilling rig can produce 7 to 8 terabytes of data daily. Waiting for all of this data to be sent to the cloud and acted upon won’t cut it for many applications in industrial or consumer domains. Even for edge analytical applications, conventional big data architectures that involve forwarding everything to the cloud to be stored and analyzed don’t make sense; what you likely need are streaming analytics capabilities, offered by solutions like TIBCO StreamBase, in the edge. Simply put, cloud-centric IoT approaches are too expensive, insecure, and unreliable to operate.

TIBCO’s IoT edge vision

Around late 2015, it became clear to us that we needed some fresh thinking to address the formidable challenges posed by the IoT edge. Virtually everything we knew about building distributed systems over the past 30 years had to be questioned and analyzed in the context of this newly emerging edge landscape. Out of this realization was born Project Flogo, our moonshot project to build integration technology that could run on the next 20 billion edge devices. What Flogo started in 2016, we are taking to the next level in 2017 with Flogo edge applications that run on the tiniest of microcontrollers with a footprint as low as 50KB.


Figure 1: Flogo Edge Applications run with footprints as low as 50KB

At a high level, our vision for IoT edge applications is driven by three core beliefs:

  • Edge-native by design
  • Engineered for Connected Intelligence
  • Open as a matter of strategic choice

I will attempt to provide a little more color around these three beliefs and connect them to a couple of takeaways for technologists and decision makers.

Edge-native design

Edge-native design simply means we are not retrofitting cloud-native technologies and architectures for the edge unless absolutely necessary. This sort of “edge-washing” is not entirely new. In many ways, it is reminiscent of the “cloud-washing” we saw in the early days of cloud computing. When innovators pioneer a new paradigm shift and enter the early majority, there is often a gold rush by vendors, suppliers, analysts, and users alike to reflexively retrofit what worked in the past into what is emerging as a new paradigm. This often translates into admirable but somewhat wonky efforts, such as Node.js, a good fit for server-side JavaScript apps, being pressed into service for edge applications and infrastructure. Ultimately, design is what design does, and there are good examples of edge-native design throughout Flogo: externalizing all of the application state to a pluggable state management service, step-back remote debugging, externalizing flow configuration to a remote flow service, and so on.
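To make the first of those examples concrete, here is a minimal Python sketch of what externalizing application state behind a pluggable interface looks like. The names (`StateStore`, `process_event`) are invented for illustration and are not Flogo APIs; the point is only that the app itself holds no state, so the backing store can be swapped without touching application logic.

```python
class StateStore:
    """Pluggable state interface: the edge app keeps no state of its own."""
    def get(self, key): raise NotImplementedError
    def put(self, key, value): raise NotImplementedError

class InMemoryStore(StateStore):
    """One possible backend; a remote state service would be another."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key)
    def put(self, key, value):
        self._data[key] = value

def process_event(store, device_id, reading):
    # The app is stateless: all state lives behind the StateStore
    # interface, so in-memory and remote backends are interchangeable.
    last = store.get(device_id)
    store.put(device_id, reading)
    return last is not None and abs(reading - last) > 10  # sudden jump?

store = InMemoryStore()
process_event(store, "sensor-7", 20.0)
print(process_event(store, "sensor-7", 35.0))  # True -- jump detected
```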

Engineered for Connected Intelligence

We are also approaching this design from the unique perspective of Connected Intelligence. In simple terms, we are no longer looking at IoT edge apps merely as an application development, integration, or analytics problem. We are not building just edge apps, we are seeking to provide connected intelligence capabilities that are physically resident in the edge as edge “applications”. What are some of these connected intelligence capabilities in these edge apps? 

  • Sense activity that is happening all around and turn them into events of significance
  • Connect in the edge and back into cloud and on-premise apps
  • Learn from these event streams in supervised, semi-supervised, and unsupervised ways
  • Act in real time in response to these events based on pre-trained models and/or declarative logic

Another way to conceptualize this would be to think of these edge applications as facilitating connected but potentially self-reliant swarms of Observe-Orient-Decide-Act (OODA) loops with support from peers in the edge as well as the cloud. We believe this is a powerful shift in perspective and stands in sharp contrast to conventional server-side or cloud-native application development approaches.
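One pass of such an OODA loop can be sketched in Python. This is a deliberately toy illustration of the shape of the loop, not Flogo or IOTA code; the stages and the overheat rule are invented.

```python
def ooda_step(observe, orient, decide, act):
    """One pass of an Observe-Orient-Decide-Act loop in an edge app."""
    event = observe()          # Observe: raw activity from sensors
    context = orient(event)    # Orient: enrich/classify into an event of significance
    action = decide(context)   # Decide: pre-trained model or declarative rule
    return act(action)         # Act: respond locally, in real time

result = ooda_step(
    observe=lambda: {"temp_c": 104},
    orient=lambda e: "overheat" if e["temp_c"] > 100 else "normal",
    decide=lambda c: "throttle" if c == "overheat" else "noop",
    act=lambda a: a,
)
print(result)  # throttle
```

In a real swarm, each stage could also consult peers at the edge or fall back to the cloud, but the self-reliant loop above is what lets the device keep acting when neither is reachable.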


Figure 2: Contrasting edge-native and cloud-native thinking

Open as a matter of choice

When there are 20 billion things that are nothing short of full-fledged computers, one needs to approach this as a general computing problem, not a highly specialized sensor-actuator problem. Openness becomes a strategic choice, not only to reduce specific types of technology lock-in risk but also to promote experimentation and agility.

Many of today’s IoT platforms embed a proprietary SDK or API in the edge layer. In my opinion, that does not represent an edge application strategy, any more than a SQL client library represents a data management strategy. Those remote platform interfaces may well need to be embedded in edge apps, but the right architecture with the right set of abstractions will pay off in the medium to long run. We also believe that edge applications will need to embrace a multi-paradigmatic approach. For instance, some edge apps may be served better by event-driven abstractions than by a traditional API-led approach. Another example of this openness in action is leveraging cheap cloud infrastructure for model training with deep-learning frameworks like Google TensorFlow, Amazon MXNet, and Microsoft CNTK, in conjunction with technologies like TIBCO Enterprise Runtime for R and TIBCO Spotfire for data science tooling and visual analytics. These models can then be embedded in edge apps, enabling real-time inferencing without costly cloud hops. Assemble a toolchain of best-in-class capabilities, but do so with a radical commitment to exercising degrees of freedom where it matters, to promote experimentation and freedom of choice.
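The train-in-the-cloud, infer-at-the-edge split can be illustrated with a deliberately tiny Python sketch: only the learned weights ship to the device, and scoring happens locally with no network round trip. The model, weights, and threshold here are all invented for illustration; a real deployment would export a model from a framework such as TensorFlow rather than hand-code coefficients.

```python
import math

# Toy logistic-regression coefficients, pretend-trained in the cloud.
# Only these numbers need to ship to the edge device.
WEIGHTS = [0.8, -0.5]
BIAS = 0.1

def edge_infer(features):
    """Score locally on the device -- no cloud hop per inference."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))   # probability of, say, failure

p = edge_infer([2.0, 0.5])
print(p > 0.5)  # True -- flag for intervention without leaving the edge
```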

Takeaways

If you are a technologist or business strategist looking at edge computing, here are the key takeaways:

  • Start experimenting today: The lifeblood of digital transformation and innovation is data-driven experimentation. We are incredibly lucky to live in an age where the cost of experimentation has hit unprecedented lows with the emergence of open source frameworks and cloud offerings—get started today.
  • Translate your technology strategy to digital objectives: Focus on driving net digital value, not just point technology outcomes with your edge app strategy. Work with lines of business and executive stakeholders to build that closed loop translation between your edge innovation efforts and business outcomes.
  • Embrace edge-native and openness: There is no longer a need to edge-wash technologies to meet your edge application needs. Embrace edge-native design choices that do justice to the challenges and opportunities that edge computing brings up.

These are indeed exciting times and we are in many ways just getting started. We would love to hear your thoughts and experiences!


The TIBCO Blog

Aluvii stands apart from the competition by offering cutting edge data visualization with Power BI Embedded


The Power BI Team is excited to highlight a new blog series of notable ISVs who have integrated Power BI Embedded into their offerings to differentiate and innovate their solutions. Our first story comes from Aluvii, which offers a SaaS POS solution for amusement parks and leisure facilities. With Power BI Embedded, Aluvii aims to stand apart from the competition by bringing rich data visualization to life inside its application offerings. How did they integrate Power BI Embedded into their product offering? Read more below, and stay tuned for additional story posts coming your way this month and through December.

Aluvii, an all-in-one POS software platform for the amusement and leisure industries, recently released its flagship SaaS product and is experiencing a very positive response in the marketplace. The cloud-based software includes a comprehensive set of modules needed to efficiently run an entire business, including ticketing, point of sale, e-commerce, memberships, events & reservations, inventory, HR, scheduling & timekeeping, sales & marketing, a member portal, online waivers, and much more. Because Aluvii is cloud-based, it’s accessible anytime, anywhere, and on any device. In addition, all registers, customer data, and reports are always available, safe, and up to date.

Aluvii recognizes that automated business intelligence and reporting are critical to customers. As such, Aluvii selected and utilizes Microsoft Power BI Embedded as its cutting-edge reporting and dashboard technology solution.

The primary reason for our selection of Microsoft Power BI Embedded was the need for a single reporting platform that could be deployed across multiple operating systems, web portals, and applications. Microsoft’s Power BI is an excellent platform for the deployment of dashboards and reports, but it lacked the flexibility of embedded application inclusion. With the release of the Power BI Embedded solution, we at Aluvii were able to leverage everything we loved in the standard Power BI framework in an integrated product. We researched other solutions, but none truly allowed for embedded application deployment, and many required convoluted integration architectures with a greater possibility of breakage. Since Aluvii is built on top of Azure, the Power BI Embedded solution was seamless and drove down the total cost of ownership.

The same architecture that allows for rapid deployment of reporting is also managed through Azure API Management services. This allows our more advanced clients to build their own custom Power BI Desktop reports and dashboards. With the API Gateway created by Azure API Management, Aluvii is able to set up subscription models that provide access to data and other features. Billing clients is also streamlined through this process.

Our development team has leveraged Microsoft Azure as its PaaS, the core API architecture used by the company, which has allowed for increased flexibility in development and deployment. And by kick-starting the business using the Microsoft BizSpark program, we have been able to test and utilize the best of Microsoft’s product line, ensuring a state-of-the-art solution at affordable prices.

The Azure SQL Database product, which is used as the backbone of the Aluvii Software Suite, allows for quick integration with Power BI Embedded. The write-once-deploy-everywhere software process ensures that our reports can be integrated within our web portals, desktop applications, and mobile platforms with minimal development and cost. Microsoft Power BI’s flexible visualizations offer our Aluvii product the ability to develop truly unique reporting solutions across all of our various feature sets, using multiple data sources. At this time, Aluvii is unique in providing dynamic, multi-tenant reporting services built into the core application among all of its other software features.

Overall, Aluvii credits its growing success to its diverse product offerings, the scalability of the Microsoft Azure cloud platform, and innovative development teams. Since the initial release of its flagship SaaS offering, Aluvii has received tremendous interest from customers across the globe. The Power BI Embedded reporting feature allows Aluvii to differentiate itself from its competitors, granting deep insights into its customers’ operations.


Microsoft Power BI Blog | Microsoft Power BI

Analytics for an Edge: Optimizing Operations in the Energy Sector


The energy economy has been patchy in 2016. Engineering innovations in drilling and completions have been enormously successful, leading to a glut in crude supply. Cheap crude has led to increased fuel production, and refinery margins are narrowing.

When margins are low, successful companies are increasing their focus on technology innovations and running a profitable business. This results in more attention on planning and operations issues like where to drill, what rigs to operate, which assets to acquire or divest, and how to run the business to optimize production, sales, marketing, and profit.

The use of analytics is fundamental in keeping profits up and maximizing returns. There has never been a better time for energy companies to sharpen the saw on their data infrastructure and analytics deployments.

How can data and analytics help? 

From upstream exploration and production to refining and logistics, trading, downstream fuels, and marketing, there are opportunities for optimization. Innovative companies are using analytics to manage asset portfolios, maximize production, address environmental health and safety, monitor equipment, and streamline logistics, sales, and marketing.

All of these areas benefit from business insights and dashboards for measurement and diagnosis, embedded reporting, collaboration, predictive and streaming analytics, and process automation. One key path to extreme value is extracting insights from historical data “at rest” and putting those insights into action on fresh data “in motion”. This applies to data refresh rates that are daily, hourly, by the minute, second, or even sub-second in some applications.

Consider production optimization as a use case. With careful interconnection of systems and application of analytics, a production manager can be physically sitting in Houston, monitoring thousands of wells around the world from close-to-real-time feeds of equipment sensor data.

Based on the sensor data stored in historian systems and integrated with advanced analytics tools, that manager is able to visualize patterns and leading indicators for equipment stoppages that the analytics team has developed. The analytics patterns can be simple rules like changes in pump intake pressure, or more flexible empirical models; forming leading indicators to failure conditions such as gas buildup in the downhole, plugged tubing and the like.

These patterns can then be used to monitor refreshed equipment sensor data in real time, to anticipate when and why a potential equipment issue may arise. These potential issues can be communicated to responsible engineers for intervention, and managed in a collection of accumulated intelligence on the equipment for convenient ongoing assessment, maintenance, and production optimization.
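A leading-indicator rule like the pump intake pressure example can be sketched in a few lines of Python. The rule, window, threshold, and numbers below are invented for illustration; real patterns would be developed by the analytics team against historian data.

```python
def pressure_drop_alert(history, window=3, threshold=5.0):
    """Simple leading-indicator rule: alert when pump intake pressure
    falls by more than `threshold` across the last `window` readings."""
    if len(history) < window:
        return False
    recent = history[-window:]
    return (recent[0] - recent[-1]) > threshold

# Apply the rule to refreshed sensor data as each reading arrives
readings = []
alerts = []
for psi in [101.0, 100.5, 100.2, 99.8, 93.0]:  # pressure starts to fall
    readings.append(psi)
    if pressure_drop_alert(readings):
        alerts.append(len(readings))  # index of the reading that tripped it

print(alerts)  # [5]
```

The same structure generalizes: swap the hand-written rule for an empirical model, and route the alert to the responsible engineer instead of a list.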

This is the industrial Internet of Things (IIoT) in action, and the energy sector continues to be a technology and business leader in this area.

TIBCO’s insight to action platform and the industrial Internet of Things (IIoT)

This rapid insight-to-action data refresh and analytics workflow is enabled via an enterprise software insight platform, combining systems integration, data management, analytics, collaboration, and event processing components.

TIBCO’s insight-to-action platform features interconnected data, systems, and analytics—that enable the analyst and manager experience. This platform can underpin a company’s intelligent operations with comprehensive data management and analytics logic.

TIBCO’s platform combines TIBCO Spotfire and StreamBase for visual and streaming analytics; TIBCO EMS, BusinessWorks, and OpenSpirit for messaging and integration, including data connectors for specialized data sources in the energy sector; and TIBCO tibbr for collaboration and crowdsourcing across field and management teams.

TIBCO Spotfire is widely used in the energy space across upstream exploration and production, asset management, supply chain, trading, refining, and downstream fuels and marketing. Spotfire has a large install base in the Energy sector – used by all the majors, across all functional areas, and by most exploration and production companies worldwide. Spotfire also has a healthy ecosystem of partners and system integrators specializing in the Energy Sector.

While simple to use, Spotfire is deep in GeoAnalytics and Predictive Analytics, including the industry’s only embeddable commercial R engine, TIBCO Enterprise Runtime for R (TERR), and advanced mapping capabilities.

TIBCO is also strong in streaming analytics and event processing. TIBCO StreamBase, a streaming analytics platform that uses a visual programming paradigm to apply math to real-time data streams, is used for equipment maintenance, management of non-productive time (NPT), well operations, and real-time drilling applications. TIBCO BusinessEvents, a stateful, event-driven rules platform, allows organizations to quickly build event analytics applications with real-time logic and reasoning. These applications can be configured with an in-memory data mart like TIBCO Live Datamart, combining rich real-time visualizations with continuous queries and user-driven alerts, actions, and responses.
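To make the idea of a stateful, event-driven rule concrete, here is a minimal plain-Python sketch (not the actual TIBCO BusinessEvents API) that keeps per-equipment state and raises an alert after a configurable number of consecutive out-of-range readings. The equipment IDs, operating range, and event feed are invented for illustration.

```python
from collections import defaultdict

class EquipmentMonitor:
    """Minimal stateful rule: alert when a piece of equipment reports
    N consecutive out-of-range readings. Real event platforms add
    time windows, persistence, and richer CEP operators."""

    def __init__(self, low, high, consecutive=3):
        self.low, self.high = low, high
        self.consecutive = consecutive
        self.streak = defaultdict(int)  # per-equipment state

    def on_event(self, equipment_id, value):
        if self.low <= value <= self.high:
            self.streak[equipment_id] = 0  # reading is healthy; reset
            return None
        self.streak[equipment_id] += 1
        if self.streak[equipment_id] >= self.consecutive:
            return f"ALERT {equipment_id}: {self.streak[equipment_id]} readings out of range"
        return None

monitor = EquipmentMonitor(low=100, high=200, consecutive=3)
events = [("pump-7", 150), ("pump-7", 250), ("pump-7", 260), ("pump-7", 270)]
alerts = [a for a in (monitor.on_event(eq, v) for eq, v in events) if a]
print(alerts)  # → ['ALERT pump-7: 3 readings out of range']
```

The per-equipment dictionary is what makes the rule "stateful": each event is evaluated against accumulated history rather than in isolation, which is the essential difference between event processing and simple per-record filtering.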

The TIBCO insight-to-action platform enables equipment management and production optimization in real time while building institutional knowledge: which equipment components have issues, which regions those issues affect, and which units are consistent bad actors. This platform forms a digital nervous system across field equipment, technical staff, and management, delivering notifications and alerts to the engineers who can take action to keep oil and gas production flowing.

TIBCO’s 14th annual Houston Energy Forum

The TIBCO Houston Energy Forum brings together hundreds of data management and analytics professionals in the energy sector for 2 days of technology innovation and sharing. Themes for 2016 include Operational Excellence, Strategy, and Technology—with presenters from leading energy companies across the US.

The latest and greatest updates from TIBCO Spotfire, StreamBase, and OpenSpirit are featured on the program, including new geoanalytics, graphics, data discovery, and predictive analytics features, templates, and data functions. Day 1 features customer and TIBCO technical presentations, and day 2 includes training and a Hackathon for Spotfire users. Some of the technical material is being published to the TIBCO Community—for use in the Hackathon and to extend the reach of the forum. See below for links to the material in the TIBCO Community:

TIBCO Community – Energy

TIBCO Community – Spotfire Community Wiki

TIBCO Community – Exchange

In these times of low margins, innovative companies are accelerating their digital journey. From upstream exploration and production to refining, logistics, downstream fuels, and marketing, there are opportunities for optimization. Innovative companies are using analytics to prioritize asset portfolios, maximize production, address environmental health and safety, manage equipment, and streamline logistics, sales, and marketing. It’s the surest path to competitive technical and business processes, efficient operations, and profit in the current environment.

Join us for 2 days of augmenting intelligence for energy professionals at the Houston Energy Forum. Find more information and register here.

The TIBCO Blog

Microsoft CRM on the Edge

Can I get CRM to run faster? I typically hear this from clients who run older versions of CRM in Internet Explorer. Many factors can contribute to the lag, but the most likely culprit is the web browser. Internet Explorer has a bad rap for running slowly, but it is not necessarily to blame. It is a browser built around extensibility, which lets a user plug in as many (emphasis on the word MANY) add-ons as they like. This slows down not only the CRM experience, but web browsing overall.

In older versions of Microsoft Dynamics CRM, the only supported web browser was Internet Explorer. With the growing popularity of alternative browsers like Chrome, Firefox, and Safari, Microsoft recognized the need to add cross-browser support, and to build a web experience that could compete with those rival browsers.

Enter Microsoft’s new web browser called Edge, which is exclusively found on Windows 10. This web browser was built primarily with speed and performance in mind, and will deliver a much faster web experience, especially with Microsoft CRM. Keep in mind that Edge is only supported on certain CRM versions that contain the proper service packs:

– Microsoft Dynamics CRM 2015 with Update 0.2 (7.0.2) and Update 1.1 (7.1.1)

– Microsoft Dynamics CRM 2013 will be supported with the pending release of Update Rollup 4 for Service Pack 1 (6.1.4)

Older versions of CRM 2011 or CRM 4 will not support the Edge browser. Not to worry, Beringer has extensive experience with upgrading your CRM system to the latest version, or even migrating you to Microsoft Online.

Beringer Associates is a leading Microsoft Gold Certified Partner specializing in Microsoft Dynamics CRM and CRM for Distribution. We also provide expert Managed IT Services, Backup and Disaster Recovery, Cloud Based Computing, and Unified Communication Systems.

by Beringer Associates

CRM Software Blog