Tag Archives: edge

On the Edge of More Accessible IoT Innovation

We’re all well-versed in the narrative that pits cloud and edge computing against one another as competing entities.

Both have jostled for the top status as IT’s core disruptor and been positioned as a stark choice to make, depending on a business’s priorities and capabilities. Yet this ‘either/or’ conundrum is a myth worth dispelling; they are entirely different concepts. The edge — that physical space that draws computing and intelligence ever closer to the data source — becomes a delivery mechanism for the disconnected elements of the cloud. As such, they can work in synergy, rather than as replacements, making an effective hybrid that combines the edge’s agility with the sheer processing power of the central cloud. It’s why both environments are deployment options for the new breed of developers creating smarter, event-driven microservices for faster, more flexible app development.

While it was predicted that connectivity costs would reduce to such an extent that locating system intelligence in the central cloud would become the preferred option, these predictions have not come to fruition. Instead, we have seen the gradual migration of IoT-created data to the edge and with it, the natural progression of enhanced connectivity and functionality. Indeed, intelligence at the very edge of the network is not only more accessible but captured in real time, in its purest form and freshest state. This makes it most valuable for informing immediate and accurate operational decisions.

The benefits rumble on: with compute happening directly on the device, bottlenecks caused by multiple devices communicating back to a centralized core network are consigned to the past. Furthermore, security risks are minimized, as the time data spends in transit, vulnerable to attack, is significantly reduced. When analytics are added to the mix, things become more interesting still, as subsets of data can be collated and analysis localized for enhanced decision making.

While the case for the edge has always been compelling, not everything has traditionally thrived there. Today’s machine learning algorithms, for example, and their requirement for huge volumes of data and computing power, have long relied on the cloud to do the heavy lifting. Yet as artificial intelligence becomes a more mainstream reality, informing daily routines from smart cars to digital personal assistants, things have changed. Attention has now turned to how it can better deliver on the network periphery and the space closer to the mobile phones, computers and other devices where the applications which commonly harness this technology run.

We have already seen the benefits play out in the smart home area. Here, deep learning capabilities at the edge of the network inform the nuances and intuitive responses of IoT-enabled digital tools that integrate and interact to provide insight as situations change. They can then feed back real-time contextual information to the homeowner or, if intruders appear, to a professional monitoring resource.

This was just the start; bringing machine learning capabilities to the edge, on device, with no connectivity requirements, and simplifying long-standing IoT integration challenges, has wider ramifications for a host of industries and applications beyond the consumer space. Solutions that can respond to events in milliseconds represent the cutting edge of innovation in this space, unlocking ever greater value in territories as diverse as industrial settings and the medical sector.

Here, real-time information at its most accessible will drive intelligent diagnostic capabilities on medical devices and harness machine learning to make all manner of predictions, such as which patients are most at risk from a hospital infection or most likely to be readmitted after discharge. At this stage, we are not privy to the full potential of AI’s role in this environment. However, a future where a medical facility can offer patients the option of receiving online medical advice from an artificial intelligence software programme is on the horizon, and promises improvements in speed and efficiency, patient care, and cost savings. Equally, strides are being made in industrial settings, where data must flow between a myriad of sensors, devices, assets, and machinery in the field, often in unstructured or challenging and remote conditions. Detecting anomalies on the edge device offers the kind of agility needed for predictive monitoring and mission-critical decisions, with the potential to save millions by addressing equipment failures before they do damage.
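As a rough illustration of what such on-device anomaly detection can look like, the sketch below keeps a rolling mean and standard deviation over recent sensor readings and flags values that drift well outside the norm, all without leaving the device. It is a minimal example in Go; the simulated readings and the three-sigma threshold are assumptions for the sake of the sketch, not a reference to any particular product.

```go
// Minimal on-device anomaly detection: a rolling mean/stddev over recent
// readings, flagging values more than three standard deviations away.
package main

import (
	"fmt"
	"math"
)

type rollingStats struct {
	window []float64
	size   int
}

func (r *rollingStats) add(v float64) {
	r.window = append(r.window, v)
	if len(r.window) > r.size {
		r.window = r.window[1:]
	}
}

func (r *rollingStats) meanStd() (float64, float64) {
	n := float64(len(r.window))
	var sum float64
	for _, v := range r.window {
		sum += v
	}
	mean := sum / n
	var sq float64
	for _, v := range r.window {
		sq += (v - mean) * (v - mean)
	}
	return mean, math.Sqrt(sq / n)
}

func main() {
	stats := &rollingStats{size: 20}
	// Simulated temperature readings; 95.3 is the injected fault.
	readings := []float64{70.1, 70.4, 69.9, 70.2, 70.0, 95.3, 70.1}

	for _, v := range readings {
		if mean, std := stats.meanStd(); len(stats.window) >= 5 && std > 0 && math.Abs(v-mean) > 3*std {
			fmt.Printf("anomaly: reading %.1f deviates from mean %.1f\n", v, mean)
		}
		stats.add(v)
	}
}
```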

Crucially, open source projects that simplify the development and deployment of microservices and IoT applications become the bedrock of this innovation. By enabling autonomous device action, they usher in a smarter edge and an era of more accessible IoT development where the potential is limitless.

The TIBCO Blog

Analyst Whitepaper: How Edge Computing, Serverless, and Machine Learning Will Transform the Enterprise

Machine learning at the edge is becoming an integral part of modern applications. From the web to mobile to IoT, machine learning is powering a new breed of applications through natural user experiences and inbuilt intelligence, all within devices that can make decisions and take actions right on the spot.

Unfortunately, today’s virtual machines and containers are too heavyweight an option to be deployed at the edge. As a result, developers are moving towards a serverless compute model that uses code snippets as the unit of execution. This enables machine learning models to be packaged as functions so they can run at the edge.
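To make the "model as a function" idea concrete, here is a minimal sketch in Go of a machine learning model packaged as a single stateless function, exposed as an HTTP handler in the way many serverless runtimes invoke code. The endpoint, payload shape, and model coefficients are illustrative assumptions, not any specific vendor's API.

```go
// A tiny pre-trained logistic model wrapped in one stateless handler:
// the function, not a VM or container image, is the unit of deployment.
package main

import (
	"encoding/json"
	"fmt"
	"math"
	"net/http"
)

type request struct {
	Features []float64 `json:"features"`
}

// classify is the deployable unit: it scores one event using weights
// assumed to have been produced by offline training.
func classify(w http.ResponseWriter, r *http.Request) {
	var req request
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	weights := []float64{0.8, -1.2, 0.3} // assumed exported coefficients
	z := -0.5                            // bias term
	for i, wgt := range weights {
		if i < len(req.Features) {
			z += wgt * req.Features[i]
		}
	}
	prob := 1.0 / (1.0 + math.Exp(-z))
	fmt.Fprintf(w, `{"score": %.3f}`, prob)
}

func main() {
	http.HandleFunc("/classify", classify)
	http.ListenAndServe(":8080", nil) // a serverless runtime would manage this lifecycle
}
```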

The convergence of machine learning, edge computing, and serverless will become the backbone of digital transformation in industry verticals such as manufacturing, automobile, healthcare, finance, and insurance in the very near future. And TIBCO is here to help.

TIBCO has built solutions that enable developers to bring the power of machine learning and serverless computing to the edge. We now have edge computing and serverless computing platforms integrated with machine learning frameworks. Our products, like TIBCO Flogo® Enterprise and Project Mashling™, make it easy to develop and deploy intelligent applications packaged as microservices running at the edge.

To learn more about these trends, please download the analyst white paper: “How Edge Computing, Serverless, and Machine Learning Will Transform the Enterprise” today.

The TIBCO Blog

Cloud, Edge, or Local? Dynamics 365 Finance and Operations Deployment Options

In today’s blog on Dynamics 365 for Finance and Operations, we are going to touch on different deployment scenarios that are available for customers. When choosing a Microsoft Dynamics 365 for Finance and Operations deployment, there are three options: Cloud, Cloud and Edge (hybrid), or Local Business Data (on-premises).

The option you choose will depend on your business need. One great feature is that you have the option to change up your deployment down the line. Why would you do that? Well, maybe you recently made a large infrastructure investment but still want to implement Dynamics 365. You could start with a Local implementation and then switch over to the Cloud at the end of that hardware’s life cycle. How great is that?

Now let’s move on to learning more about each of the deployment options (we’ll use Microsoft’s descriptions of each type and explain what they mean).

Cloud

This scenario offers a fully Microsoft managed cloud ERP service that includes high availability (H/A), disaster recovery (D/R), sandbox environments, and application life cycle management that are combined with cloud-based systems of intelligence, infrastructure, compute, and database services in a single offering.

What does this mean?

It means that your Dynamics 365 deployment lives in Microsoft’s Cloud and is fully managed by them, so Microsoft is the Data Trustee in this scenario. This covers your high availability and disaster recovery using geo-redundant data centers. This option has the tightest integration with the Azure stack offerings like BI, Machine Learning, Logic Apps, etc. Microsoft is taking a cloud-first approach, so some new functionality around infrastructure tie-ins (SQL) will likely appear here first.

Why would you consider this deployment type?

There are some great benefits with this option. You get “always on” infrastructure without having to staff it and you get the disaster recovery (D/R) and high availability (H/A) without the additional hardware investments. One of the biggest advantages of this option is scalability. You can scale up when needed and without an additional hardware investment. You are paying for user licensing and not the computing power or “hardware.” Need to add a new division and need additional AOS resources or SQL processing power? You can scale up with a service request.

Cloud and Edge

A primarily cloud-based scenario that combines the power of a Microsoft managed Cloud service with plans to enable organizations to run their business processes from application servers at the edges. This means that transactions are supported by local application services and business data is stored locally.

What does this mean?

It means that this is a hybrid of Cloud and On-Premises deployments. Transactions are entered via local applications and the data is stored locally in the customer’s data center. There is also a central Cloud node which provides a single view of the business data and provides failover to the Cloud – D/R and H/A. Since it is still using the Cloud, you can still take advantage of BI, Machine Learning, etc. without a lot of manual configuration.

Why would you consider this deployment type?

Some organizations, like Retail or Manufacturing, may require things to be local for business continuity. For example, a retailer must be able to run the store POS operations even if internet connectivity goes down for a few seconds or minutes. This way you can capture the transactions locally and sync back up when you’re back online and don’t have to worry about redundant internet connectivity at the store level. In this scenario, both the customer and Microsoft are considered the data trustee.
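A rough sketch of that store-and-forward behavior is shown below: transactions are always captured in a local queue, and a sync step drains the queue to the central node whenever connectivity is available. The transaction type and the sendToCloud stub are illustrative placeholders for the pattern, not part of Dynamics 365 itself.

```go
// Store-and-forward: record locally no matter what, sync upstream when online.
package main

import (
	"errors"
	"fmt"
	"sync"
)

type transaction struct {
	ID     string
	Amount float64
}

type store struct {
	mu      sync.Mutex
	pending []transaction
}

func (s *store) record(t transaction) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.pending = append(s.pending, t) // always succeeds locally, online or not
}

// sendToCloud stands in for the call to the central node; it fails while offline.
func sendToCloud(t transaction, online bool) error {
	if !online {
		return errors.New("no connectivity")
	}
	fmt.Println("synced", t.ID)
	return nil
}

func (s *store) sync(online bool) {
	s.mu.Lock()
	defer s.mu.Unlock()
	var remaining []transaction
	for _, t := range s.pending {
		if err := sendToCloud(t, online); err != nil {
			remaining = append(remaining, t) // keep for the next attempt
		}
	}
	s.pending = remaining
}

func main() {
	s := &store{}
	s.record(transaction{ID: "T-1", Amount: 19.99}) // captured while offline
	s.sync(false)                                   // nothing leaves the store
	s.sync(true)                                    // connectivity restored, queue drains
}
```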

Local Business Data

A scenario that will allow customers to deploy a Dynamics 365 for Finance and Operations environment into their or their partner’s data center and run operations without replicating business data to the Cloud.

What does this mean?

This one is fairly self-explanatory. The customer deploys all the Dynamics 365 components in their own data center. None of the applications or data reside in the Cloud. The customer is the data trustee, responsible for H/A and D/R, and for maintaining all the servers in the data center. Since there isn’t integration with the Cloud, options like integrated BI and Machine Learning are not available or require some additional configuration. Configuring all the applications does require some heavy lifting from your IT staff, and there are some serious system requirements.

Why would you consider this deployment type?

Some organizations are just not ready or comfortable with their mission critical data and apps residing in the Cloud. Or maybe you just spent a ton of money on recent hardware improvements and want to take advantage of that. Some industry mandates may require full access and control to the infrastructure or data. Maybe internet connectivity is an issue and there aren’t many options for redundant connections.

Here is a matrix that outlines the options and provides some details on each. Please note that the image is now a bit outdated and all the options are currently available.

[Image: matrix comparing the Cloud, Cloud and Edge, and Local Business Data deployment options]

Clearly, there is a lot to consider when making your decision on a Dynamics 365 for Finance and Operations deployment type. The good news is that there are options and you can choose to go one route in the beginning and then change directions downstream if you want!

Learn more about Dynamics 365 for Finance and Operations in our on-demand webinar series! Learn how PowerObjects can help you on your Dynamics 365 journey here.

Happy Dynamics 365’ing!

PowerObjects- Bringing Focus to Dynamics CRM

FogHorn raises $30 million to provide IoT edge computing analytics

FogHorn, which provides data analytics software for industrial and commercial Internet of Things (IoT) applications, announced today that it has secured $30 million in funding. Intel Capital and Saudi Aramco Energy Ventures co-led this round, with new investor Honeywell Ventures joining in. All previous investors also participated, including March Capital, GE, Dell, Bosch, Yokogawa, Darling Ventures, and The Hive.

“In the industrial application of IoT, such as manufacturing facilities, oil and gas production sites, and transportation systems, there are often hundreds or thousands of physical and video/audio sensors continuously producing a massive amount of high velocity data,” wrote FogHorn CEO David King, in an email to VentureBeat. “This data is being collected at the network ‘edge’, what Cisco coined the ‘fog’ layer several years ago.”

Edge computing is a method of optimizing cloud computing systems by performing data processing at the edge of the network, near the source of the data. According to King, industrial operators face several challenges when collecting and processing data, including a high volume of data-collecting sources, high costs of transporting this data to the cloud, and limits to real-time insights.

While latency may be fine when you are conversing with Amazon’s Alexa, having a delayed response to a gas leak could be extremely dangerous.

FogHorn’s Lightning platform has been purpose-built to run in very small footprint (256MB or smaller) edge computing systems. “The reason this is important is that the vast majority of data streaming from IoT sensors is useless within a very short period of time,” wrote King. “The information that is valuable — the anomalies and hard-to-detect patterns — need to be acted upon while operators can take corrective action.”

FogHorn licenses its software on a subscription basis to dozens of customers, according to King. The chief executive does not see any direct competitors focusing solely on tapping into streaming edge data for analytics, machine learning, and AI. “Amazon Greengrass and Microsoft Azure Edge are now targeting the edge with reduced footprint versions of their heavy cloud software stacks, but both still send most data to the cloud for advanced data science functionality,” he added.

The investment from Saudi Aramco Energy Ventures should secure FogHorn’s foothold in Saudi Arabia, which is one of the world’s biggest oil producers.

“Given the heavy presence of oil and gas, we expect it to be a large market in the future,” wrote King. “By partnering with Saudi Aramco Energy Ventures, we’re just beginning our reach into this market.”

To date, FogHorn has raised a total of $47.5 million. The Mountain View, California-based startup will use the fresh injection of capital to hire more engineers and increase sales and marketing efforts.

Founded in 2014 as part of The Hive incubator, FogHorn currently has more than 40 employees.

Big Data – VentureBeat

Edge Computing And The New Decentralization: The Rhyming Of IT History

Responding to rapid changes

A digital core is an IT architecture that offers stability and long-term reliability for core enterprise processes, yet also provides the flexibility to adapt quickly to new opportunities, challenges, and regulations. This is the concept of “bimodal IT”: a stable and reliable IT architecture that handles end-to-end processes, provides simulation and analytical capabilities, simplifies order commitments, empowers teams with individualized customer insight, supports business models, and connects to business networks. This solid foundation gives you a single source of truth, which in turn enables the flexibility to respond to innovation and change, such as new business models, new regulations, and business events such as mergers or acquisitions.

Simulation and analytics

The ability to respond quickly is an essential part of managing a high-tech company, and simulation, prediction, and analytical capabilities are important components of doing so. Data is critical for the insight needed to make decisions. This insight must be at a granular level, so that decision-makers have the detail they need to understand trends, opportunities, and risks, and can quickly carry out what-if analysis using predictive algorithms.

Keeping data under control

Many companies have challenges in effectively leveraging all the data they collect. They need the computing capability to run complex algorithms over large data sets to support timely, real-time analysis. Everyone in the company must have access to the data they need, whenever and wherever they need it. This also extends to the ecosystem, to enable suppliers to stay up to date with a company’s orders, and to allow salespeople to see customer history for credit risk and stock information and provide accurate delivery schedules.

With this sort of data and analysis, companies can then reevaluate the services or products that are provided to customers. And they can simulate the impact of providing bundles and services, which can also enhance the customer experience.

For more insight on how advanced technology is transforming business strategy, see IoT Today: Be A Disruptor Or Be Disrupted—Look At Your Business A Whole New Way.

Digitalist Magazine

Kleptocracy American-Style's flagship ready to sail … over the edge

That first meeting between Robert Mueller’s folks and Reince Priebus should be interesting… more on The Real Animal Husbands of the Trump White House.

Because it’s Das Love Boot… and Reince will not be the next ambassador to Greece, but there does seem to be a junta coming into focus at the WH. 

Oliver North NSC uniform wearer

Trump has a thing for generals. He routinely refers to them as “my generals” and brags about them looking like they’re from “central casting.” Kelly was the third general the president named to a senior position after Secretary of Defense James Mattis—who had served so recently that he required a special waiver from Congress in order to take the job—and National Security Adviser Michael Flynn, who Trump reportedly asked to wear his uniform to work despite the fact Flynn had retired from the service. (The retired Kelly also does not wear his uniform to the workplace.) When Flynn was forced to resign over contacts with Russia, Trump replaced him with another general, H.R. McMaster, who has the advantage of still being in the military, meaning he can wear his uniform around the office….

Despite Trump’s professions of love and admiration for his generals and his claim that he’s giving the military “total authorization,” the president doesn’t always seem to care about what they have to say. He ignored Mattis’ plan for fighting ISIS for months. He’s widely known to be frustrated with McMaster, particularly the national security adviser’s desire to commit more troops to Afghanistan. And despite his claim that he’d decided to ban transgender people from the military after consultation with “my generals,” the Pentagon had no idea the announcement was coming. Mattis was apparently only informed the day before and was on vacation. Joint Chiefs Chairman Gen. Joseph Dunford says he’s not going to implement the ban until he receives an official order.

Before Kelly moves into his new role, then, particularly given that his new brief will involve more than just national security issues, he should keep in mind that Trump is often less interested in heeding his generals’ wisdom than he is in using them as props.

moranbetterDemocrats

Connected Intelligence in the IoT Edge

Flogo + IOTA—The little IoT app engines that could!

We are thrilled to have announced Flogo Edge Applications and TIBCO® IOT App Engine (IOTA™) last week at TIBCO NOW in Berlin. On this occasion, we are happy to feature a 3-part series of blogs that will explain our strategy and vision behind Flogo Edge and IOTA:

The three articles in this series will be:

  • Strategizing for IoT edge – Rajeev Kozhikkatuthodi
  • Introducing Flogo edge apps – Matt Ellis
  • Industrializing the IoT edge with TIBCO IOTA – Rahul Kamdar

There are few things in today’s technology landscape that promise to be more transformational than the emergence of an estimated 20 billion IoT edge devices by 2020. Even for an industry that, as the joke goes, has incorrectly predicted 10 out of the last 3 “game-changers” over the past couple of decades, the implications of this brave new era of edge compute are difficult to ignore.

Last week at TIBCO NOW in Berlin, we unveiled TIBCO IoT App Engine (IOTA), a commercial, industrial IoT offering for your IoT Edge Applications. Powered by the open source Project Flogo, IOTA sits at the intersection of the Internet of Things (IoT) and edge application development. It is an important milestone in the evolution of not just TIBCO’s mission to Interconnect Everything, but also our overarching Connected Intelligence vision and strategy. In this post, I will provide some context around the emergence of edge computing and the importance of edge applications in particular, and outline TIBCO’s vision for IoT edge applications.

Emergence of edge computing and edge apps

A lot of insightful analyses have been published on this topic, including by Peter Levine, General Partner at a16z, on how cloud computing is coming to an end. He argues that today’s cloud computing era, powered by largely centralized compute infrastructures, is going to be (yet again!) replaced by a distributed computing era. A key area of interest for my team is the applications that physically run on these edge devices and interact with cloud apps and services. At the end of the day, despite appearances, those 20 billion things are nothing but programmable, resource-constrained computers that also happen to be things. This opens up unprecedented opportunities and challenges around architecting, developing, and operating these billions of edge apps. These edge apps are not going to look like traditional cloud or on-premise apps. Analysts like Janakiram MSV have written about how the edge is not just ideal for IoT solutions, but also extends to an entire new class of business applications, and we wholeheartedly agree.

Today’s cloud-centric IoT models won’t hold

One of the well-understood impacts of the edge as it emerges is that it will unleash a data tsunami. The evidence is already out there—a connected car can produce up to 5 terabytes of data hourly, and an oil and gas drilling rig can produce 7 to 8 terabytes of data daily. Waiting for all of this data to be sent to the cloud and acted upon won’t cut it for many applications in industrial or consumer domains. Even for edge analytical applications, conventional big data architectures that involve forwarding everything to the cloud to be stored and analyzed don’t make sense—what you likely need are streaming analytics capabilities, offered by solutions like TIBCO StreamBase, in the edge. Simply put, cloud-centric IoT approaches are too expensive, too insecure, and too unreliable to operate.
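To illustrate the alternative, the sketch below shows the kind of reduction an edge app can perform before anything leaves the device: out-of-range readings are forwarded immediately, while everything else is condensed into periodic aggregates. The thresholds and the channel standing in for the uplink are illustrative assumptions, not TIBCO StreamBase APIs.

```go
// Edge-side reduction: forward only alerts and periodic summaries upstream.
package main

import "fmt"

type summary struct {
	count int
	sum   float64
	max   float64
}

func main() {
	readings := []float64{3.1, 3.0, 3.2, 9.7, 3.1, 3.0} // simulated sensor stream
	uplink := make(chan string, 10)                      // stand-in for the cloud connection

	var s summary
	for i, v := range readings {
		s.count++
		s.sum += v
		if v > s.max {
			s.max = v
		}
		if v > 8.0 { // out-of-range readings are forwarded immediately
			uplink <- fmt.Sprintf("alert: reading %.1f at index %d", v, i)
		}
		if s.count%5 == 0 { // everything else is reduced to a periodic aggregate
			uplink <- fmt.Sprintf("summary: n=%d mean=%.2f max=%.1f", s.count, s.sum/float64(s.count), s.max)
		}
	}
	close(uplink)
	for msg := range uplink {
		fmt.Println(msg)
	}
}
```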

TIBCO’s IoT edge vision

Around late 2015, it became clear to us that we needed some fresh thinking to address the formidable challenges posed by the IoT edge. Virtually everything we knew about building distributed systems over the past 30 years had to be questioned and analyzed in the context of this newly emerging edge landscape. Out of this realization was born Project Flogo, our moonshot project to build integration technology that could run on the next 20 billion edge devices. What Flogo started in 2016, we are taking to the next level in 2017 with Flogo edge applications that run on the tiniest of microcontrollers, with a footprint as low as 50KB.

Figure 1: Flogo Edge Applications run with footprints as low as 50KB

At a high level, our vision for IoT edge applications is driven by three core beliefs:

  • Edge-native by design
  • Engineered for Connected Intelligence
  • Open as a matter of strategic choice

I will attempt to provide a little more color around these three beliefs and connect them to a couple of takeaways for technologists and decision makers.

Edge-native design

Edge-native design simply means we are not retrofitting cloud-native technologies and architectures for the edge unless absolutely necessary. This sort of “edge-washing” is not entirely new. In many ways, it is reminiscent of the “cloud-washing” we saw in the early days of cloud computing. When innovators pioneer a paradigm shift and it reaches the early majority, there is often a gold rush by vendors, suppliers, analysts, and users alike to reflexively retrofit what worked in the past onto what is emerging as the new paradigm. This often translates into admirable but somewhat wonky efforts, such as a technology like Node.js, a good fit for server-side JavaScript apps, being pressed into service for edge applications and infrastructure. Ultimately, design is what design does, and there are good examples of edge-native design throughout Flogo, such as externalizing all of the application state to a pluggable state management service, step-back remote debugging, and externalizing flow configuration to a remote flow service.

Engineered for Connected Intelligence

We are also approaching this design from the unique perspective of Connected Intelligence. In simple terms, we are no longer looking at IoT edge apps merely as an application development, integration, or analytics problem. We are not building just edge apps; we are seeking to provide connected intelligence capabilities that are physically resident in the edge as edge “applications”. What are some of these connected intelligence capabilities in these edge apps?

  • Sense activity that is happening all around and turn them into events of significance
  • Connect in the edge and back into cloud and on-premise apps
  • Learn from these event streams in supervised, semi-supervised, and unsupervised ways
  • Act in real time in response to these events based on pre-trained models and/or declarative logic

Another way to conceptualize this would be to think of these edge applications as facilitating connected but potentially self-reliant swarms of Observe-Orient-Decide-Act (OODA) loops with support from peers in the edge as well as the cloud. We believe this is a powerful shift in perspective and stands in sharp contrast to conventional server-side or cloud-native application development approaches.

Figure 2: Contrasting edge-native and cloud-native thinking
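For readers who prefer code to diagrams, here is a minimal sketch of such an OODA-style loop running on a device: sense a value, orient it against recent context, decide with a simple rule standing in for a pre-trained model or declarative logic, and act locally. The function names and threshold are illustrative only; they are not Project Flogo or IOTA APIs.

```go
// An OODA-style edge loop: observe, orient, decide, act, all on the device.
package main

import "fmt"

func sense(i int) float64 { // observe: read the sensor (simulated)
	values := []float64{21.0, 21.2, 21.1, 27.9, 21.3}
	return values[i%len(values)]
}

func orient(history []float64, v float64) float64 { // orient: compare against recent context
	if len(history) == 0 {
		return 0
	}
	var sum float64
	for _, h := range history {
		sum += h
	}
	return v - sum/float64(len(history))
}

func decide(delta float64) bool { // decide: threshold rule standing in for a model
	return delta > 5.0
}

func act(v float64) { // act: take local action and notify upstream
	fmt.Printf("significant event: %.1f -> actuating locally and notifying cloud\n", v)
}

func main() {
	var history []float64
	for i := 0; i < 5; i++ {
		v := sense(i)
		if decide(orient(history, v)) {
			act(v)
		}
		history = append(history, v)
	}
}
```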

Open as a matter of choice

When there are 20 billion things that are nothing short of full-fledged computers, one needs to approach this as a general computing problem, not a highly specialized sensor-actuator problem. Openness becomes a strategic choice to reduce specific types of technology lock-in risk, but also to promote experimentation and agility.

Many of today’s IoT platforms embed a proprietary SDK or API in the edge layer. In my opinion, they do not represent an edge application strategy, any more than a SQL client library represents a data management strategy. Those remote platform interfaces may very well need to be embedded in edge apps; however, the right architecture with the right set of abstractions will pay off in the medium to long run. We also believe that edge applications will need to embrace a multi-paradigm approach. For instance, some edge apps may be served better by event-driven abstractions than by a traditional API-led approach. Another example of this openness in action would be to leverage cheap cloud infrastructure to do model training with deep-learning frameworks like Google TensorFlow, Amazon MXNet, and Microsoft CNTK, in conjunction with technologies like TIBCO Enterprise Runtime for R and TIBCO Spotfire for data science tooling and visual analytics. These models can then be embedded in edge apps, enabling real-time inferencing without costly cloud hops. Assemble a toolchain of best-in-class capabilities, but do so with a radical commitment to exercising degrees of freedom where it matters, to promote experimentation and freedom of choice.
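As a small illustration of that train-in-the-cloud, infer-at-the-edge split, the sketch below loads model parameters exported by a cloud training job from a local artifact and scores events entirely on the device, with no cloud round trip per inference. The file name, schema, and linear model are assumptions made for the example.

```go
// Load an exported model artifact at startup and run inference locally.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// modelArtifact is the assumed shape of the exported parameters.
type modelArtifact struct {
	Weights   []float64 `json:"weights"`
	Bias      float64   `json:"bias"`
	Threshold float64   `json:"threshold"`
}

func load(path string) (modelArtifact, error) {
	var m modelArtifact
	data, err := os.ReadFile(path)
	if err != nil {
		return m, err
	}
	err = json.Unmarshal(data, &m)
	return m, err
}

// infer scores one feature vector locally using the embedded parameters.
func (m modelArtifact) infer(features []float64) bool {
	score := m.Bias
	for i, w := range m.Weights {
		if i < len(features) {
			score += w * features[i]
		}
	}
	return score > m.Threshold
}

func main() {
	model, err := load("model.json") // artifact produced by the cloud training job
	if err != nil {
		fmt.Println("falling back to defaults:", err)
		model = modelArtifact{Weights: []float64{1.0, -0.5}, Bias: 0, Threshold: 0.7}
	}
	fmt.Println("anomalous:", model.infer([]float64{0.9, 0.1}))
}
```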

Takeaways

If you are a technologist or business strategist looking at edge computing, here are the key takeaways:

  • Start experimenting today: The lifeblood of digital transformation and innovation is data-driven experimentation. We are incredibly lucky to live in an age where the cost of experimentation has hit unprecedented lows with the emergence of open source frameworks and cloud offerings—get started today.
  • Translate your technology strategy to digital objectives: Focus on driving net digital value, not just point technology outcomes with your edge app strategy. Work with lines of business and executive stakeholders to build that closed loop translation between your edge innovation efforts and business outcomes.
  • Embrace edge-native and openness: There is no longer a need to edge-wash technologies to meet your edge application needs. Embrace edge-native design choices that do justice to the challenges and opportunities that edge computing brings up.

These are indeed exciting times and we are in many ways just getting started. We would love to hear your thoughts and experiences!

The TIBCO Blog

Aluvii stands apart from the competition by offering cutting edge data visualization with Power BI Embedded

The Power BI Team is excited to highlight a new blog series featuring notable ISVs who have integrated Power BI Embedded into their offerings to differentiate and innovate their solutions. This story comes from Aluvii, which offers a SaaS POS solution for amusement parks and leisure facilities. With Power BI Embedded, Aluvii aims to stand apart from the competition by bringing rich data visualization to life inside its application offerings. How did they integrate Power BI Embedded into their product offering? Read more below, and stay tuned for additional story posts coming your way this month and through December.

Aluvii, an all-in-one POS software platform for the amusement and leisure industries, recently released its flagship SaaS product and is experiencing a very positive response in the marketplace. The cloud-based software includes a comprehensive set of modules needed to efficiently run an entire business, including ticketing, point of sale, e-commerce, memberships, events & reservations, inventory, HR, scheduling & timekeeping, sales & marketing, a member portal, online waivers, and much more. Because Aluvii is cloud-based, it’s accessible anytime, anywhere, and on any device. In addition, all registers, customer data, and reports are always available, safe, and up to date.

Aluvii recognizes that automated business intelligence and reporting are critical to customers. As such, Aluvii selected and utilizes Microsoft Power BI Embedded as its cutting-edge reporting and dashboard technology solution.

The primary reason for our selection of Microsoft Power BI Embedded was the need for a single reporting platform that could be deployed across multiple operating systems, web portals, and applications. Microsoft’s Power BI is an excellent platform for the deployment of dashboards and reports, but it lacked the flexibility of embedded application inclusion. With the release of the Power BI Embedded solution, we at Aluvii were able to bring everything we loved in the standard Power BI framework into an integrated product. We researched other solutions, but none truly allowed for embedded application deployment, and many required convoluted integration architectures with a greater possibility of breakage. Since Aluvii is built on top of Azure, the Power BI Embedded solution was seamless and drove down the total cost of ownership.

The same architecture that allows for rapid deployment of reporting is also managed through Azure API Management services. This allows our more advanced clients to build their own custom Power BI Desktop reports and dashboards. With the API gateway created by Azure API Management, Aluvii is able to set up subscription models that provide access to data and other features. Billing clients is also streamlined through this process.

Our development team has leveraged Microsoft Azure as its PaaS and as the core API architecture used by the company, which has allowed for increased flexibility in development and deployment. And by kick-starting the business using the Microsoft BizSpark program, we have been able to test and utilize the best of Microsoft’s product line, ensuring a state-of-the-art solution at affordable prices.

The Azure SQL Database product, which is used as the backbone of the Aluvii Software Suite, allows for quick integration with Power BI Embedded. The write-once-deploy-everywhere software process ensures that our reports can be integrated into our web portals, desktop applications, and mobile platforms with minimal development and cost. Microsoft Power BI’s flexible visualizations give our Aluvii product the ability to develop truly unique reporting solutions across all of our various feature sets, using multiple data sources. At this time, Aluvii is unique in providing dynamic, multi-tenant reporting services built into the core application alongside all of its other software features.

Overall, Aluvii credits its growing success to its diverse product offerings, the scalability of the Microsoft Azure cloud platform, and innovative development teams. Since the initial release of its flagship SaaS offering, Aluvii has received tremendous interest from customers across the globe. The Power BI Embedded reporting feature allows Aluvii to differentiate itself from its competitors, granting deep insights into its customers’ operations.

Microsoft Power BI Blog | Microsoft Power BI