Tag Archives: analytics

Churn Prevention: New UK Rules Make Analytics Vital


Whilst UK consumers are likely to welcome the reforms to mobile provider switching that Ofcom announced recently, the changes give mobile operators an additional regulatory headache when it comes to churn prevention. The reforms are centred on making the switching process easier and enabling consumers to start it by sending a free text message to their current provider.

Ofcom research has suggested that one of the biggest hurdles consumers face is “having to speak to your current provider, and facing unwanted attempts to persuade you to stay.” So, whilst it’s clearly in operators’ interests to have a conversation with the customer and offer an incentive to stay, customers “will control how much contact they have with their current mobile provider, preventing companies from delaying and frustrating the switch process.”

This raises questions that operators will need to consider. Most providers cite customer satisfaction measures, such as Net Promoter Scores, as absolutely key, and consider exceptional service a competitive differentiator. In fact, research presented by Analysys Mason at the European Telecoms Summit 2017 suggests that every 1% improvement in customer satisfaction correlates with a 4% reduction in churn.

Yet with around 5 million customers per year choosing to switch provider, it is clear that operators still have work to do to maintain loyal relationships with their customers. Removing the “traditional” retention conversation could push providers to focus on managing their customers appropriately before the customer even begins to consider their options.

Churn Analytics

This is where advanced analytics comes into its own. Analytics can proactively segment customers according to their likelihood of leaving, with churn scores based on their profile and network activity data, such as the age of their mobile device, number of negative customer support interactions and dropped call or service incidents.
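To make the idea concrete, here is a minimal sketch of such a churn score. It assumes a simple logistic model with hand-picked, purely illustrative feature names and weights; a real operator would train the weights on historical churn outcomes.

    import math

    # Hand-tuned weights standing in for a model trained on historical churn
    # outcomes. Feature names and weights are hypothetical illustrations.
    WEIGHTS = {
        "device_age_months": 0.06,        # older handsets correlate with upgrade churn
        "negative_support_contacts": 0.45,
        "dropped_calls_last_30d": 0.08,
    }
    BIAS = -3.0

    def churn_score(profile):
        """Return a 0-1 churn propensity from profile and network activity data."""
        z = BIAS + sum(w * profile.get(name, 0) for name, w in WEIGHTS.items())
        return 1 / (1 + math.exp(-z))   # logistic link maps the score to a probability

    customer = {"device_age_months": 22, "negative_support_contacts": 3,
                "dropped_calls_last_30d": 5}
    print(f"churn propensity: {churn_score(customer):.2f}")   # -> 0.52

Customers above an agreed score threshold would then feed into the value-based segmentation described next.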

These customers can also be segmented by value and other dimensions, including cost to serve, to inform the decision as to which tailored, automated actions will mitigate each individual’s risk of churn. Those actions might include incentives that exceed expectations and delight customers; equally, the analysis may show that the investment required to retain a customer would be better used elsewhere, and that some low-value customers should be allowed to churn.

Having the ability to interpret the churn signals coming from different customer segments, together with the tools to execute decisions that mitigate the risk and cost-effectively retain customers, is becoming ever more important in churn prevention. This is supported by research from Three quoted in the Ofcom report:

 “61% of switchers who were offered a reactive save deal said they had made their final decision before starting the switching process, and their previous provider could not reasonably have done anything to change their mind about switching.”

There must be scope for operators to understand these switchers at a more sophisticated level and take action that dissuades them from making switch requests.

How Fast Can You Respond to a Switch Request?

Some savvy customers will of course use the process to their advantage, as they do today; many customers say they want to leave simply to get a better deal, and in those cases the churn prevention activity is relatively straightforward. However, one of the policy aims of the Ofcom decision is to reduce unwanted attempts to persuade customers to remain. Ofcom’s expectation, in short, is that providers should not proactively contact the customer about the port/switching code in a way that might induce the customer not to proceed with the switch or to call the current provider. That said, Ofcom does recognise that once a switch request has been made, operators are still free to contact the customer to attempt a save.

With a window of one working day or less (in most cases) between the request and the customer receiving a switch code, operators have a short opportunity to get clever about identifying the optimal save incentive and communicating it to the customer through the digital channel most likely to result in a save. The more agile operators are likely to use prescriptive analytics and decisioning to identify the incentive that makes most sense financially, the personalised offer that is most likely to convince the customer to stay, and the channel(s) the customer is most likely to respond to.
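As a sketch of that decisioning step, the toy below picks the save offer with the best expected value for a given customer. The offer catalogue, save probabilities, and customer lifetime values are invented for illustration; a real system would estimate per-customer uplifts from historical save outcomes.

    OFFERS = [
        # (name, cost, estimated probability the offer saves this customer)
        ("10% bill discount for 6 months", 36.0, 0.30),
        ("free handset upgrade",          120.0, 0.55),
        ("extra data at no charge",        10.0, 0.15),
    ]

    def best_offer(customer_clv):
        """Expected value of an offer = P(save) * remaining CLV - offer cost."""
        scored = [(p * customer_clv - cost, name) for name, cost, p in OFFERS]
        ev, name = max(scored)
        return (name, round(ev, 2)) if ev > 0 else (None, 0.0)  # None: let them churn

    print(best_offer(customer_clv=400.0))  # high-value: ('free handset upgrade', 100.0)
    print(best_offer(customer_clv=60.0))   # low-value: (None, 0.0), not worth saving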

As with so many other changes in the modern era, the operators that are best equipped to adapt to these changing regulatory requirements and manage churn prevention will be those that have embraced deep analytics and have the tools to put the analytical outputs into operation.


FICO

Expenses Fraud or Honest Mistake? Predictive Analytics Will Tell


Are traditional approvals and audit reviews of procure-to-pay transactions sufficient to prevent expenses fraud, waste, and abuse in your organization? The evidence suggests not. A recent study estimated that expenses fraud committed by employees costs U.S. businesses more than $2.8 billion per year. In the public sector and higher education, we see numerous news articles about government and university employees either misusing funds or putting the organization at risk by bypassing important controls.

It should be said that the vast majority of staff do the right thing. A number of studies have shown that the majority of fraud (around 80 percent) is committed by just 5 percent of employees. The types of issues which most organizations are trying to prevent include:

  • Buying personal items with a corporate purchase card (p-card)
  • Inflating an expense report with personal travel or unapproved expenses
  • Directing purchases to favored suppliers
  • Unallowable usage of sponsored awards including federal grant monies

Less troubling, but still problematic, are violations of established policies such as:

  • Missing documentation to support procurement requirements (e.g., bids, quotes) for special funds or grants
  • Splitting purchases into multiple transactions to avoid a procurement threshold
  • Purchases made with a p-card for an item that is available on an approved catalog
  • Taxable travel expenses that are not properly coded as taxable

The sheer volume of financial transactions makes it almost impossible for a large organization to solve this problem with manual reviews, and while effective, auditing 100 percent of all transactions is cost-prohibitive for mid-to-large organizations. And even the best audit program is likely to miss issues such as split transactions, PO leakage, or potential HIPAA violations.

What is needed is an automated system that can review every transaction, using predictive analytics to estimate the risk level that each transaction represents to the enterprise. This approach lets managers focus their time during approval reviews, and auditors center their attention on the highest-priority items, saving money while simultaneously reducing waste and fraud.

Of course, expenses scandals are all the evidence needed to understand why an automated system would reduce the likelihood of public embarrassment or negative audit findings that could result in the loss of grant or other funding.

Challenges with Traditional Expense Approval Processes

Organizations often rely on management review and authorization to ensure the integrity of financial transactions. While many managers are responsible, diligent, and thoughtful, not all approvals receive appropriate scrutiny. Many managers are extremely busy and click an “approve” button without giving the request an appropriately detailed review. In some cases, managers trust their employees to do the right thing and don’t review the details of expense reports or purchase requests, especially if the employee stays within budget. Other managers will delegate the approval to administrative staff who are ill-equipped or lack the authority to truly scrutinize a purchase order or expenses.

Automated risk modeling is needed to help focus these reviews so that managers can question high-risk transactions. By ‘risk’ we mean a mathematical score that combines many factors into a single measure of the risk the transaction poses to the organization. Our deep history in fraud-focused predictive analytics and scoring, designed to protect the credit and debit portfolios of the world’s financial institutions, means our technology and approach can also be applied very effectively to ‘internal transactions’.

Risk Scoring in Real Time

Experience shows that there is a direct correlation between having data to inform decisions early in the process and higher rates of compliance. If an organization has automated notifications that indicate which specific rules have been violated, plus risk scores to help prioritize and focus attention, the review and audit process will be more accurate, efficient, and effective. Risk assessment is based on many factors, including:

  • Funding source. Some funding sources, like Federal Grant money, are highly sensitive and subject to external audit.
  • Dollar amount. Higher dollar amounts represent more risk, but also dollar amounts just below review or procurement thresholds carry additional risks.
  • Purchase category. Over time, certain types of purchases or expenses are more likely to be abused.

By scoring the case using mathematically derived predictive analytics, the ERP or expense reporting system can provide alerts to reviewers (managers and auditors) for additional scrutiny prior to initial approval, so that suspect transactions are stopped before any outlay of funds. Requestors can also be provided with real-time feedback about their purchase, potentially stopping them from even making the request.
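A toy version of such a score, combining the factors above, might look like the following. The weights, threshold, and field names are hypothetical illustrations, not FICO's actual model.

    PROCUREMENT_THRESHOLD = 5000.00   # hypothetical: purchases above this need a formal bid

    def risk_score(txn):
        """Combine funding source, amount, and category into a 0-1 risk score."""
        score = 0.0
        if txn.get("funding_source") == "federal_grant":
            score += 0.4                             # externally audited funds
        if txn["amount"] > PROCUREMENT_THRESHOLD:
            score += 0.2
        elif txn["amount"] > 0.9 * PROCUREMENT_THRESHOLD:
            score += 0.3                             # suspiciously close to the threshold
        if txn.get("category") in {"gift cards", "electronics", "travel"}:
            score += 0.2                             # historically abuse-prone categories
        return min(score, 1.0)

    txn = {"funding_source": "federal_grant", "amount": 4700.00, "category": "electronics"}
    print(f"risk: {risk_score(txn):.2f}")   # 0.90 -> alert the approver before funds go out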

Automated Review of Receipts Using OCR

Technology is now available which can automatically review the text within receipts or purchase card data (Level 3) for signs of improper or unusual spending. Through automation, these reviews occur more cost-effectively than through traditional manual processes. These reviews would include checks for the following (a toy sketch of such checks appears after the list):

  • Receipts for unapproved items (e.g., alcohol or entertainment)
  • Airplane tickets for someone other than the employee
  • Improper class of service (e.g., first class airfare, luxury car rental)
  • Personal side trips accompanying business travel
  • Hotel location outside of the approved destination
  • Additional nights of hotel and excessive ancillary costs (e.g. gym, spa, room service, laundry, etc.)
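The sketch below shows the flavor of such checks against OCR-extracted receipt text. The keyword lists are invented for illustration; as noted next, production systems pair rules like these with predictive models to keep false positives manageable.

    UNAPPROVED_ITEMS = {"alcohol", "wine", "beer", "spa", "casino"}
    PREMIUM_CLASS = ("first class", "business class", "luxury")

    def receipt_flags(ocr_text, employee_name, approved_city):
        """Return a list of policy flags found in OCR-extracted receipt text."""
        text = ocr_text.lower()
        flags = []
        if UNAPPROVED_ITEMS & set(text.split()):
            flags.append("unapproved item")
        if any(phrase in text for phrase in PREMIUM_CLASS):
            flags.append("improper class of service")
        if employee_name.lower() not in text:
            flags.append("traveler name not on receipt")
        if approved_city.lower() not in text:
            flags.append("location outside approved destination")
        return flags

    receipt = "Delta Air Lines  passenger: j. smith  first class  JFK-LHR"
    print(receipt_flags(receipt, "J. Smith", "London"))
    # -> ['improper class of service', 'location outside approved destination']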

Since some flagged items may be permissible, refining the predictive models and business rules reduces false positives. In addition, these same models can identify permissible expenses that are taxable but were not properly coded.

Over time, merchants will provide higher-quality Level 2 and Level 3 data for p-card transactions, because the credit card companies charge merchants a lower commission rate based on the data they include, which incentivizes their participation and further supports automation.

Change Habits by Reducing the Opportunity for Errors

Fraud has been described as a result of pressure, rationalization and opportunity.  Improper actions can occur when otherwise diligent staff rationalize that they can save time by cutting corners and with a belief they will not be caught.  By instituting sophisticated automated checks and publicizing the existence of those controls, staff will make the reasoned decision that they are likely to be caught, which will in turn lead to an increase in voluntary compliance.  The automated system can also continually look for new types of noncompliance, to prevent new approaches to circumventing processes and controls.

Over time this will lead to a reduced cost of compliance, improved usage of master contracts, and spending which better matches established policies, reducing costs within the procure-to-pay process.


FICO

If Analytics Is The Engine, Then Data Is The Fuel Of The 21st Century

Businesses share something important with lions. When a lion captures and consumes its prey, only about 10% to 20% of the prey’s energy is directly transferred into the lion’s metabolism. The rest evaporates away, mostly as heat loss, according to research done in the 1940s by ecologist Raymond Lindeman.

Today, businesses do only about as well as the big cats. When you consider the energy required to manage, power, and move products and services, less than 20% goes directly into the typical product or service—what economists call aggregate efficiency (the ratio of potential work to the actual useful work that gets embedded into a product or service at the expense of the energy lost in moving products and services through all of the steps of their value chains). Aggregate efficiency is a key factor in determining productivity.

After making steady gains during much of the 20th century, businesses’ aggregate energy efficiency peaked in the 1980s and then stalled. Japan, home of the world’s most energy-efficient economy, has been skating along at or near 20% ever since. The U.S. economy, meanwhile, topped out at about 13% aggregate efficiency in the 1990s, according to research.

Why does this matter? Jeremy Rifkin says he knows why. Rifkin is an economic and social theorist, author, consultant, and lecturer at the Wharton School’s Executive Education program who believes that economies experience major increases in growth and productivity only when big shifts occur in three integrated infrastructure segments around the same time: communications, energy, and transportation.

But it’s only a matter of time before information technology blows all three wide open, says Rifkin. He envisions a new economic infrastructure based on digital integration of communications, energy, and transportation, riding atop an Internet of Things (IoT) platform that incorporates Big Data, analytics, and artificial intelligence. This platform will disrupt the world economy and bring dramatic levels of efficiency and productivity to businesses that take advantage of it, he says.

Some economists consider Rifkin’s ideas controversial. And his vision of a new economic platform may be problematic—at least globally. It will require massive investments and unusually high levels of government, community, and private sector cooperation, all of which seem to be at depressingly low levels these days.

However, Rifkin has some influential adherents to his philosophy. He has advised three presidents of the European Commission—Romano Prodi, José Manuel Barroso, and the current president, Jean-Claude Juncker—as well as the European Parliament and numerous European Union (EU) heads of state, including Angela Merkel, on the ushering in of what he calls “a smart, green Third Industrial Revolution.” Rifkin is also advising the leadership of the People’s Republic of China on the build out and scale up of the “Internet Plus” Third Industrial Revolution infrastructure to usher in a sustainable low-carbon economy.

The internet has already shaken up one of the three major economic sectors: communications. Today it takes little more than a cell phone, an internet connection, and social media to publish a book or music video for free—what Rifkin calls zero marginal cost. The result has been a hollowing out of once-mighty media empires in just over 10 years. Much of what remains of their business models and revenues has been converted from physical (remember CDs and video stores?) to digital.

But we haven’t hit the trifecta yet. Transportation and energy have changed little since the middle of the last century, says Rifkin. That’s when superhighways reached their saturation point across the developed world and the internal-combustion engine came close to the limits of its potential on the roads, in the air, and at sea. “We have all these killer new technology products, but they’re being plugged into the same old infrastructure, and it’s not creating enough new business opportunities,” he says.

All that may be about to undergo a big shake-up, however. The digitalization of information on the IoT at near-zero marginal cost generates Big Data that can be mined with analytics to create algorithms and apps enabling ubiquitous networking. This digital transformation is beginning to have a big impact on the energy and transportation sectors. If that trend continues, we could see a metamorphosis in the economy and society not unlike previous industrial revolutions in history. And given the pace of technology change today, the shift could happen much faster than ever before.

The speed of change is dictated by the increase in digitalization of these three main sectors; expensive physical assets and processes are partially replaced by low-cost virtual ones. The cost efficiencies brought on by digitalization drive disruption in existing business models toward zero marginal cost, as we’ve already seen in entertainment and publishing. According to research company Gartner, when an industry gets to the point where digital drives at least 20% of revenues, you reach the tipping point.

“A clear pattern has emerged,” says Peter Sondergaard, executive vice president and head of research and advisory for Gartner. “Once digital revenues for a sector hit 20% of total revenue, the digital bloodbath begins,” he told the audience at Gartner’s annual 2017 IT Symposium/ITxpo, according to The Wall Street Journal. “No matter what industry you are in, 20% will be the point of no return.”

Communications is already there, and energy and transportation are heading down that path. If they hit the magic 20% mark, the impact will be felt not just within those industries but across all industries. After all, who doesn’t rely on energy and transportation to power their value chains?

That’s why businesses need to factor potentially massive business model disruptions into their plans for digital transformation today if they want to remain competitive with organizations in early adopter countries like China and Germany. China, for example, is already halfway through a US$88 billion upgrade to its state electricity grid that will enable renewable energy transmission around the country—all managed and moved digitally, according to an article in The Economist magazine. And it is competing with the United States for leadership in self-driving vehicles, which will shift the transportation process and revenue streams heavily to digital, according to an article in Wired magazine.

Once China’s and Germany’s renewables and driverless infrastructures are in place, the only additional costs are management and maintenance. That could bring businesses in these countries dramatic cost savings over those that still rely on fossil fuels and nuclear energy to power their supply chains and logistics. “Once you pay the fixed costs of renewables, the marginal costs are near zero,” says Rifkin. “The sun and wind haven’t sent us invoices yet.”

In other words, zero marginal cost has become a zero-sum game.

To understand why that is, consider the major industrial revolutions in history, writes Rifkin in his books, The Zero Marginal Cost Society and The Third Industrial Revolution. The first major shift occurred in the 19th century when cheap, abundant coal provided an efficient new source of power (steam) for manufacturing and enabled the creation of a vast railway transportation network. Meanwhile, the telegraph gave the world near-instant communication over a globally connected network.

The second big change occurred at the beginning of the 20th century, when inexpensive oil began to displace coal and gave rise to a much more flexible new transportation network of cars and trucks. Telephones, radios, and televisions had a similar impact on communications.

Breaking Down the Walls Between Sectors

Now, according to Rifkin, we’re poised for the third big shift. The eye of the technology disruption hurricane has moved beyond communications and is heading toward—or as publishing and entertainment executives might warn, coming for—the rest of the economy. With its assemblage of global internet and cellular network connectivity and ever-smaller and more powerful sensors, the IoT, along with Big Data analytics and artificial intelligence, is breaking down the economic walls that have protected the energy and transportation sectors for the past 50 years.

Daimler is now among the first movers in transitioning into a digitalized mobility internet. The company has equipped nearly 400,000 of its trucks with external sensors, transforming the vehicles into mobile Big Data centers. The sensors are picking up real-time Big Data on weather conditions, traffic flows, and warehouse availability. Daimler plans to establish collaborations with thousands of companies, providing them with Big Data and analytics that can help dramatically increase their aggregate efficiency and productivity in shipping goods across their value chains. The Daimler trucks are autonomous and capable of establishing platoons of multiple trucks driving across highways.

It won’t be long before vehicles that navigate the more complex transportation infrastructures around the world begin to think for themselves. Autonomous vehicles will bring massive economic disruption to transportation and logistics thanks to new aggregate efficiencies. Without the cost of having a human at the wheel, autonomous cars could achieve a shared cost per mile below that of owned vehicles by as early as 2030, according to research from financial services company Morgan Stanley.

The transition is getting a push from governments pledging to give up their addiction to cars powered by combustion engines. Great Britain, France, India, and Norway are seeking to go all electric as early as 2025 and by 2040 at the latest.

The Final Piece of the Transition

Considering that automobiles account for 47% of petroleum consumption in the United States alone—more than twice the amount used for generators and heating for homes and businesses, according to the U.S. Energy Information Administration—Rifkin argues that the shift to autonomous electric vehicles could provide the momentum needed to upend the final pillar of the economic platform: energy. Though energy has gone through three major disruptions over the past 150 years, from coal to oil to natural gas—each causing massive teardowns and rebuilds of infrastructure—the underlying economic model has remained constant: highly concentrated and easily accessible fossil fuels and highly centralized, vertically integrated, and enormous (and enormously powerful) energy and utility companies.

Now, according to Rifkin, the “Third Industrial Revolution Internet of Things infrastructure” is on course to disrupt all of it. It’s neither centralized nor vertically integrated; instead, it’s distributed and networked. And that fits perfectly with the commercial evolution of two energy sources that, until the efficiencies of the IoT came along, made no sense for large-scale energy production: the sun and the wind.

But the IoT gives power utilities the means to harness these variable sources together and to account for fluctuating energy flows. Sensors on solar panels and wind turbines, along with intelligent meters and a smart grid based on the internet, manage a new, two-way flow of energy to and from the grid.

Today, fossil fuel–based power plants need to kick in extra energy if insufficient energy is collected from the sun and wind. But industrial-strength batteries and hydrogen fuel cells are beginning to take their place by storing large reservoirs of reserve power for rainy or windless days. In addition, electric vehicles will be able to send some of their stored energy to the digitalized energy internet during peak use. Demand for ever-more efficient cell phone and vehicle batteries is helping push the evolution of batteries along, but batteries will need to get a lot better if renewables are to completely replace fossil fuel energy generation.

Meanwhile, silicon-based solar cells have not yet approached their limits of efficiency. They have their own version of computing’s Moore’s Law called Swanson’s Law. According to data from research company Bloomberg New Energy Finance (BNEF), Swanson’s Law means that for each doubling of global solar panel manufacturing capacity, the price falls by 28%, from $76 per watt in 1977 to $0.41 in 2016. (Wind power is on a similar plunging exponential cost curve, according to data from the U.S. Department of Energy.)
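Those two price points let us sanity-check the 28% figure. A back-of-envelope calculation (ours, not BNEF's) shows the quoted prices imply roughly 16 capacity doublings over those 39 years, or one doubling about every two and a half years:

    import math

    drop_per_doubling = 0.28              # Swanson's Law, per BNEF
    price_1977, price_2016 = 76.0, 0.41   # $/watt, as quoted above

    # price after n doublings: price_1977 * (1 - drop_per_doubling) ** n
    doublings = math.log(price_2016 / price_1977) / math.log(1 - drop_per_doubling)
    print(f"{doublings:.1f} doublings")                      # ~15.9
    print(f"~{(2016 - 1977) / doublings:.1f} years each")    # ~2.5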

Thanks to the plummeting solar price, by 2028, the cost of building and operating new sun-based generation capacity will drop below the cost of running existing fossil power plants, according to BNEF. “One of the surprising things in this year’s forecast,” says Seb Henbest, lead author of BNEF’s annual long-term forecast, the New Energy Outlook, “is that the crossover points in the economics of new and old technologies are happening much sooner than we thought last year … and those were all happening a bit sooner than we thought the year before. There’s this sense that it’s not some distant risk or distant opportunity. A lot of these realities are rushing toward us.”

The conclusion, he says, is irrefutable. “We can see the data and when we map that forward with conservative assumptions, these technologies just get cheaper than everything else.”

The smart money, then—72% of total new power generation capacity investment worldwide by 2040—will go to renewable energy, according to BNEF. The firm’s research also suggests that there’s more room in Swanson’s Law along the way, with solar prices expected to drop another 66% by 2040.

Another factor could push the economic shift to renewables even faster. Just as computers transitioned from being strictly corporate infrastructure to becoming consumer products with the invention of the PC in the 1980s, ultimately causing a dramatic increase in corporate IT investments, energy generation has also made the transition to the consumer side.

Thanks to future tech media star Elon Musk, consumers can go to his Tesla Energy company website and order tempered glass solar panels that look like chic, designer versions of old-fashioned roof shingles. Models that look like slate, or like a curved, terracotta-colored, ceramic-style glass that will make roofs look like those of Tuscan country villas, are promised soon. Consumers can also buy a sleek-looking battery called a Powerwall to store energy from the roof.

The combination of solar panels, batteries, and smart meters transforms homeowners from passive consumers of energy into active producers and traders who can choose to take energy from the grid during off-peak hours, when some utilities offer discounts, and sell energy back to the grid during periods when prices are higher. And new blockchain applications promise to accelerate the shift to an energy market that is laterally integrated rather than vertically integrated as it is now. Consumers like their newfound sense of control, according to Henbest. “Energy’s never been an interesting consumer decision before and suddenly it is,” he says.

As the price of solar equipment continues to drop, homes, offices, and factories will become like nodes on a computer network. And if promising new solar cell technologies, such as organic polymers, small molecules, and inorganic compounds, supplant silicon, which is not nearly as efficient with sunlight as it is with ones and zeroes, solar receivers could become embedded into windows and building components. Solar production could move off the roof and become integrated into the external facades of homes and office buildings, making nearly every edifice in town a node.

The big question, of course, is how quickly those nodes will become linked together—if, doubters say, they become linked at all. As we learned from Metcalfe’s Law, the value of a network is proportional to the square of its number of connected users.
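The math behind Metcalfe's Law is simple enough to state in a few lines: the number of possible pairwise links among n nodes is n(n-1)/2, which grows roughly as n squared.

    # Metcalfe's Law: network value scales with the possible pairwise connections.
    for n in [10, 100, 1000]:
        print(f"{n} nodes -> {n * (n - 1) // 2} possible links")
    # 10 nodes -> 45; 100 nodes -> 4950; 1000 nodes -> 499500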

The Will Determines the Way

Right now, the network is limited. Wind and solar account for just 5% of global energy production today, according to Bloomberg.

But, says Rifkin, technology exists that could enable the network to grow exponentially. We are seeing the beginnings of a digital energy network, which uses a combination of the IoT, Big Data, analytics, and artificial intelligence to manage distributed energy sources, such as solar and wind power from homes and businesses.

As nodes on this network, consumers and businesses could take a more active role in energy production, management, and efficiency, according to Rifkin. Utilities, in turn, could transition from simply transmitting power and maintaining power plants and lines to managing the flow to and from many different energy nodes; selling and maintaining smart home energy management products; and monitoring and maintaining solar panels and wind turbines. By analyzing energy use in the network, utilities could create algorithms that automatically smooth the flow of renewables. Consumers and businesses, meanwhile, would not have to worry about connecting their wind and solar assets to the grid and keeping them up and running; utilities could take on those tasks more efficiently.

Already in Germany, two utility companies, E.ON and RWE, have each split their businesses into legacy fossil and nuclear fuel companies and new services companies based on distributed generation from renewables, new technologies, and digitalization.

The reason is simple: it’s about survival. As fossil fuel generation winds down, the utilities need a new business model to make up for lost revenue. Due to Germany’s population density, “the utilities realize that they won’t ever have access to enough land to scale renewables themselves,” says Rifkin. “So they are starting service companies to link together all the different communities that are building solar and wind and are managing energy flows for them and for their customers, doing their analytics, and managing their Big Data. That’s how they will make more money while selling less energy in the future.”


The digital energy internet is already starting out in pockets and at different levels of intensity around the world, depending on a combination of citizen support, utility company investments, governmental power, and economic incentives.

China and some countries within the EU, such as Germany and France, are the most likely leaders in the transition toward a renewable energy–based infrastructure because they have been able to align the government and private sectors in long-term energy planning. In the EU, for example, wind has already overtaken coal as the second largest form of power capacity behind natural gas, according to an article in The Guardian. Indeed, Rifkin has been working with China, the EU, and governments, communities, and utilities in Northern France, the Netherlands, and Luxembourg to begin building these new internets.

Hauts-de-France, a region that borders the English Channel and Belgium and has one of the highest poverty rates in France, enlisted Rifkin to develop a plan to lift it out of its downward spiral of shuttered factories and abandoned coal mines. In collaboration with a diverse group of CEOs, politicians, teachers, scientists, and others, it developed Rev3, a plan to put people to work building a renewable energy network, according to an article in Vice.

Today, more than 1,000 Rev3 projects are underway, encompassing everything from residential windmills made from local linen to a fully electric car–sharing system. Rev3 has received financial support from the European Investment Bank and a handful of private investment funds, and startups have benefited from crowdfunding mechanisms sponsored by Rev3. Today, 90% of new energy in the region is renewable and 1,500 new jobs have been created in the wind energy sector alone.

Meanwhile, thanks in part to generous government financial support, Germany is already producing 35% of its energy from renewables, according to an article in The Independent, and there is near-unanimous citizen support (95%, according to a recent government poll) for its expansion.

If renewable energy is to move forward in other areas of the world that don’t enjoy such strong economic and political support, however, it must come from the ability to make green, not act green.

Not everyone agrees that renewables will produce cost savings sufficient to cause widespread cost disruption anytime soon. A recent forecast by the U.S. Energy Information Administration predicts that in 2040, oil, natural gas, and coal will still be the planet’s major electricity producers, powering 77% of worldwide production, while renewables such as wind, solar, and biofuels will account for just 15%.

Skeptics also say that renewables’ complex management needs, combined with the need to store reserve power, will make them less economical than fossil fuels through at least 2035. “All advanced economies demand full-time electricity,” Benjamin Sporton, chief executive officer of the World Coal Association told Bloomberg. “Wind and solar can only generate part-time, intermittent electricity. While some renewable technologies have achieved significant cost reductions in recent years, it’s important to look at total system costs.”

On the other hand, there are many areas of the world where distributed, decentralized, renewable power generation already makes more sense than a centralized fossil fuel–powered grid. More than 20% of Indians in far-flung areas of the country have no access to power today, according to an article in The Guardian. Locally owned and managed solar and wind farms are the most economical way forward. The same is true in other developing countries, such as Afghanistan, where rugged terrain, war, and tribal territorialism make a centralized grid an easy target, and mountainous Costa Rica, where strong winds and rivers have pushed the country to near 100% renewable energy, according to The Guardian.

The Light and the Darknet

Even if all the different IoT-enabled economic platforms become financially advantageous, there is another concern that could disrupt progress and potentially cause widespread disaster once the new platforms are up and running: hacking. Poorly secured IoT sensors have allowed hackers to take over everything from Wi-Fi enabled Barbie dolls to Jeep Cherokees, according to an article in Wired magazine.

Humans may be lousy drivers, but at least we can’t be hacked (yet). And while the grid may be prone to outages, it is tightly controlled, has few access points for hackers, and is physically separated from the Wild West of the internet.

If our transportation and energy networks join the fray, however, every sensor, from those in the steering system on vehicles to grid-connected toasters, becomes as vulnerable as a credit card number. Fake news and election hacking are bad enough, but what about fake drivers or fake energy? Now we’re talking dangerous disruptions and putting millions of people in harm’s way.

The only answer, according to Rifkin, is for businesses and governments to start taking the hacking threat much more seriously than they do today and to begin pouring money into research and technologies for making the internet less vulnerable. That means establishing “a fully distributed, redundant, and resilient digital infrastructure less vulnerable to the kind of disruptions experienced by Second Industrial Revolution–centralized communication systems and power grids that are increasingly subject to climate change, disasters, cybercrime, and cyberterrorism,” he says. “The ability of neighborhoods and communities to go off centralized grids during crises and re-aggregate in locally decentralized networks is the key to advancing societal security in the digital era,” he adds.

Start Looking Ahead

Until today, digital transformation has come mainly through the networking and communications efficiencies made possible by the internet. Airbnb thrives because web communications make it possible to create virtual trust markets that allow people to feel safe about swapping their most private spaces with one another.

But now these same efficiencies are coming to two other areas that have never been considered core to business strategy. That’s why businesses need to begin managing energy and transportation as key elements of their digital transformation portfolios.

Microsoft, for example, formed a senior energy team to develop an energy strategy to mitigate risk from fluctuating energy prices and increasing demands from customers to reduce carbon emissions, according to an article in Harvard Business Review. “Energy has become a C-suite issue,” Rob Bernard, Microsoft’s top environmental and sustainability executive, told the magazine. “The CFO and president are now actively involved in our energy road map.”

As Daimler’s experience shows, driverless vehicles will push autonomous transportation and automated logistics up the strategic agenda within the next few years. Boston Consulting Group predicts that the driverless vehicle market will hit $42 billion by 2025. If that happens, it could have a lateral impact across many industries, from insurance to healthcare to the military.

Businesses must start planning now. “There’s always a period when businesses have to live in the new and the old worlds at the same time,” says Rifkin. “So businesses need to be considering new business models and structures now while continuing to operate their existing models.”

He worries that many businesses will be left behind if their communications, energy, and transportation infrastructures don’t evolve. Companies that still rely on fossil fuels for powering traditional transportation and logistics could be at a major competitive disadvantage to those that have moved to the new, IoT-based energy and transportation infrastructures.

Germany, for example, has set a target of 80% renewables for gross power consumption by 2050, according to The Independent. If the cost advantages of renewables bear out, German businesses, which are already the world’s third-largest exporters behind China and the United States, could have a major competitive advantage.

“How would a second industrial revolution society or country compete with one that has energy at zero marginal cost and driverless vehicles?” asks Rifkin. “It can’t be done.” D!


About the Authors

Maurizio Cattaneo is Director, Delivery Execution, Energy and Natural Resources, at SAP.

Joerg Ferchow is Senior Utilities Expert and Design Thinking Coach, Digital Transformation, at SAP.

Daniel Wellers is Digital Futures Lead, Global Marketing, at SAP.

Christopher Koch is Editorial Director, SAP Center for Business Insight, at SAP.


Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



Digitalist Magazine

FICO Receives Analytics 50 Award for FICO Score XD


Drexel University’s LeBow College of Business and CIO.com have named analytic software firm FICO a winner of the Analytics 50 Awards for the second year in a row. The awards program honors organizations using analytics to solve business challenges. FICO received the award for FICO® Score XD, which leverages groundbreaking analytic technologies and alternative data to help safely and responsibly expand credit access.

For more information check out the full award article.

Led by Radha Chandra, principal scientist in the Scores business unit at FICO, the FICO analytic development team posed the question: can alternative data expand credit access? After extensive research and validation, FICO launched FICO Score XD. Through the development of FICO® Score XD, FICO provides a potential onramp to credit access for the majority of the 50+ million Americans who are identified as ‘unscorable’.

In addition to traditional credit data, FICO® Score XD consumes alternative data from telco, cable, and other payment histories, plus public record and property data. One specific design goal of FICO Score XD was to mirror the standards of the traditional FICO Score: the same 300-850 score range, a similar relationship between a given score and likelihood of repayment, and the same characteristic logic and treatment.

FICO® Score XD uses positive and negative data from the National Consumer Telecom and Utilities Exchange, Inc. (NCTUE), including new connect requests, payment history, and current and historical account status, and from LexisNexis Risk Solutions, including property ownership records, frequency of residential moves, bankruptcies, evictions, and liens. All data sources comply with the Fair Credit Reporting Act (FCRA), the federal law that regulates how consumer reporting agencies use data.

While FICO Scores based on traditional credit data remain the cornerstone of the FICO business, this initiative helps banks and other lenders expand their addressable market by leveraging scores built on models utilizing alternative data. Findings demonstrate that consumers with a high FICO® Score XD who go on to obtain credit maintain a high traditional FICO Score in the future: 49% scored 700 or higher just 24 months later.

“FICO’s pioneering analytics are a key driver to our commitment to financial inclusion and provide lenders with a reliable, broad-based risk score that enables precise assessment of creditworthiness for consumers,” said Ethan Dornhelm, vice president, Scores and Analytics, FICO. “It is an honor that FICO – and Radha Chandra on our team – have been recognized by Drexel University for our work to help millions more people gain access to credit.”

FICO Score XD research and solutions have also paved the way for similar initiatives in Mexico, Russia, Turkey and now India. The latest solution – FICO Score X Data for India – was recently announced at Money 20/20.

The Analytics 50 is a collaboration between Drexel University’s LeBow College of Business and CIO.com to recognize companies using analytics to solve business challenges. Honorees were selected by a panel of researchers and practitioners based on the complexity of the business challenges they faced, the analytics solutions implemented, and the solutions’ business impact on the organization.


FICO

Periscope Data launches unified platform for data warehousing and analytics


Periscope Data is trying to make it easier for data teams to process large amounts of information from across their companies with a new cloud data warehouse service that the company announced today.

The Unified Data Platform is designed to provide a single source for companies to perform data processing, analytics, and visualization, which is important for generating insights from the information that’s tied up in business systems.

It’s supposed to solve a key problem: While companies have plenty of data about how their business is doing, it can be hard for analysts to reason over all of it, owing to a series of technical silos that makes collating and processing all that information difficult. Periscope Data’s new product helps fix that by bringing all of a company’s data into Amazon Web Services’ Redshift database.

Once the data is loaded in, teams can query the information using SQL, R, and Python. Periscope Data’s software is also able to provide visualizations in the form of both reports and dashboards. According to Periscope Data CEO Harry Glaser, this whole system is supposed to replace the existing approach to building data warehousing and analytics capabilities, which often requires chaining together multiple software packages.
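To give a feel for that workflow, here is a generic sketch of querying a Redshift-backed warehouse from Python; it is not Periscope Data's actual API. Redshift speaks the PostgreSQL wire protocol, so psycopg2 works, and every connection detail and table name below is a placeholder.

    import psycopg2

    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
        port=5439, dbname="analytics", user="analyst", password="...",
    )
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT date_trunc('week', created_at) AS week, count(*) AS signups
            FROM signups          -- hypothetical table
            GROUP BY 1 ORDER BY 1;
        """)
        for week, signups in cur.fetchall():
            print(week, signups)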

“Generally speaking, if you’re a data team and you’re getting up and running, what you will do is assemble a platform out of various siloed products — some of them B2B products, some of them open source or consumer products — and then stitch them together with spit and duct tape,” he said in an interview with VentureBeat.

That causes a problem, though, since companies will then need to pay engineers to maintain that platform in order to keep their entire data processing regime working. Glaser said that the Unified Data Platform is designed to fix that.

In addition, having one unified platform means that it’s possible to see how changes to data affect an entire system, so companies can see what reports would be affected when a particular information source is shut off, for example.

Customers will get billed for the Unified Data Platform based on how many people use the software, as well as how much data they have stored within the data warehouse. When all is said and done, it will end up costing customers tens or hundreds of thousands of dollars per year, according to Glaser.

Glaser said that he could see a future in which customers would be able to select from different cloud platforms and different database engines in order to match the performance characteristics that they want. But that’s still a ways off. Right now, Periscope Data is quite happy working with Amazon, which was a partner for this launch.

What’s not on the table is an on-premises version of the Unified Data Platform. Glaser said that the company is remaining focused on serving only customers who want to have their information in the public cloud, since that’s where the market is headed, in his view.


Big Data – VentureBeat

Two Ways to Approach Federated Queries with U-SQL and Azure Data Lake Analytics

Did you know there are two ways to do federated queries with Azure Data Lake Analytics (ADLA)? By federated queries, I mean a query that combines (federates) data from multiple sources — in this case, from within Azure Data Lake and another data store. Federated queries are one aspect of data virtualization which helps us to access data without requiring the physical movement of data or data integration:

[Diagram: a federated query combining data in Azure Data Lake with data in a remote data store]

The two methods for federated queries with U-SQL and ADLA are:

  1. Schema-less (aka “lazy metadata”)
  2. With a pre-defined schema in an external table

You might be familiar with external tables in SQL Server, Azure SQL Data Warehouse, or APS. In those platforms, external tables work with PolyBase for purposes of querying data where it lives elsewhere, often for the purpose of loading it into the relational database. That same premise exists in Azure Data Lake Analytics as well. However, in the data lake there are two approaches – an external table is still a good idea most of the time, but it isn’t absolutely required.

Option 1: Schema-Less

Following are the components of making schema-less federated queries work in ADLA:

[Diagram: components of a schema-less federated query in ADLA]

Pros of the schema-less option:

  • Access the data quickly for exploration without requiring an external table to be defined in the ADLA Catalog
  • More closely aligned to a schema-on-read paradigm because of its flexibility 
  • Query flexibility: can retrieve a subset of columns without having to define all the columns

Cons of the schema-less option:

  • Additional “burden” on the data analyst doing the ad hoc querying with U-SQL to always perform the schema-on-read within the query
  • Repeating the same schema-on-read syntax in numerous U-SQL queries, rather than reusing the definition via an external table — so if the source system table or view changes, it could involve altering numerous U-SQL scripts.
  • Requires a rowset in the U-SQL schema-on-read queries – i.e., you cannot do a direct join, so this approach involves slightly longer, more complex syntax

Option 2: With a Pre-Defined Schema in an External Table

The following introduces an external table to the picture in order to enforce a schema:

[Diagram: a federated query using an external table defined in the ADLA Catalog]

Pros of using an external table:

  • Most efficient for the data analyst doing the ad hoc querying with U-SQL
  • Easier, shorter syntax on the query side because columns and data types have already been predefined in the ADLA Catalog, so a direct join to an external table can be used in the query without having to define a rowset
  • Only one external table to change if a modification does occur to the underlying SQL table

Cons of using an external table:

  • Schema must remain consistent – a downstream U-SQL query will error if a new column is added to the remote source and the external table has not been kept in sync
  • All remote columns must be defined in the external table (not necessarily a big con – but definitely important to know)

In summary, the schema-less approach is most appropriate for initial data exploration because of the freedom and flexibility. An external table is better suited for ongoing, routine queries in which the SQL side is stable and unchanging. Solutions which have been operationalized and promoted to production will typically warrant an external table. 

Want to Know More?

During my all-day workshop, we set up each piece step by step, including the service principal, credential, data source, external table, and so forth, so you can see the whole thing in action. The next workshop is in Washington DC on December 8th. For more details and to register, check here: Presenting a New Training Class on Architecting a Data Lake

You Might Also Like…

Querying Multi-Structured JSON Files with U-SQL

Running U-SQL on a Schedule with Azure Data Factory to Populate Azure Data Lake

Handling Row Headers in U-SQL

Data Lake Use Cases and Planning Considerations


Blog – SQL Chick

Why Predictive Analytics Has Become Essential For Optimized FP&A


Let me start with a quote from McKinsey that, in my view, hits the nail right on the head:

“No matter what the context, there’s a strong possibility that blockchain will affect your business. The very big question is when.”

Now, in the industries that I cover in my role as general manager and innovation lead for travel and transportation/cargo, engineering, construction and operations, professional services, and media, I engage with many different digital leaders on a regular basis. We are having visionary conversations about the impact of digital technologies and digital transformation on business models and business processes and the way companies address them. Many topics are at different stages of the hype cycle, but the one that definitely stands out is blockchain as a new enabling technology in the enterprise space.

Just a few weeks ago, a customer said to me: “My board is all about blockchain, but I don’t get what the excitement is about – isn’t this just about Bitcoin and a cryptocurrency?”

I can totally understand his confusion. I’ve been talking to many blockchain experts who know that it will have a big impact on many industries and the related business communities. But even they are uncertain about the where, how, and when, and about the strategy on how to deal with it. The reason is that we often look at it from a technology point of view. This is a common mistake, as the starting point should be the business problem and the business issue or process that you want to solve or create.

In my many interactions with Torsten Zube, vice president and blockchain lead at the SAP Innovation Center Network (ICN) in Potsdam, Germany, he has made it very clear that it’s mandatory to “start by identifying the real business problem and then … figure out how blockchain can add value.” This is the right approach.

What we really need to do is provide guidance for our customers to enable them to bring this into the context of their business in order to understand and define valuable use cases for blockchain. We need to use design thinking or other creative strategies to identify the relevant fields for a particular company. We must work with our customers and review their processes and business models to determine which key blockchain aspects, such as provenance and trust, are crucial elements in their industry. This way, we can identify use cases in which blockchain will benefit their business and make their company more successful.

My highly regarded colleague Ulrich Scholl, who is responsible for externalizing the latest industry innovations, especially blockchain, in our SAP Industries organization, recently said: “These kinds of use cases are often not evident, as blockchain capabilities sometimes provide minor but crucial elements when used in combination with other enabling technologies such as IoT and machine learning.” In one recent and very interesting customer case from the autonomous province of South Tyrol, Italy, blockchain was one of various cloud platform services required to make this scenario happen.

How to identify “blockchainable” processes and business topics (value drivers)

To understand the true value and impact of blockchain, we need to keep in mind that a verified transaction can involve any kind of digital asset such as cryptocurrency, contracts, and records (for instance, assets can be tangible equipment or digital media). While blockchain can be used for many different scenarios, some don’t need blockchain technology because they could be handled by a simple ledger, managed and owned by the company, or have such a large volume of data that a distributed ledger cannot support it. Blockchain would not be the right solution for these scenarios.

Here are some common factors that can help identify potential blockchain use cases:

  • Multiparty collaboration: Are many different parties involved in the process or scenario, rather than just one party that dominates everything? For example, a company with many parties in its ecosystem that are all connected to it, but not yet organized in a network or a more decentralized structure.
  • Process optimization: Will blockchain massively improve a process that today is performed manually, involves multiple parties, needs to be digitized, and is very cumbersome to manage or be part of?
  • Transparency and auditability: Is it important to offer each party transparency (e.g., on the origin, delivery, geolocation, and hand-overs) and auditable steps? (e.g., How can I be sure that the wine in my bottle really is from Bordeaux?)
  • Risk and fraud minimization: Does it help (or is there a need) to minimize risk and fraud for each party, or at least for most of them in the chain? (e.g., A company might want to know if its goods have suffered any shocks in transit or whether the predefined route was not followed.)

Connecting blockchain with the Internet of Things

This is where blockchain’s value can be increased and automated. Just think about a blockchain that is not just maintained and appended to by a human, but automatically acquires different signals from sensors, such as geolocation, temperature, shock, usage hours, alerts, etc. One that knows when a payment or any kind of money transfer has been made, a delivery has been received or arrived at its destination, or a digital asset has been downloaded from the Internet. The relevant automated actions or signals are then recorded in the distributed ledger/blockchain.

Of course, given the massive amount of data that is created by those sensors, automated signals, and data streams, it is imperative that only the very few pieces of data coming from a signal that are relevant for a specific business process or transaction be stored in a blockchain. By recording non-relevant data in a blockchain, we would soon hit data size and performance issues.
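As a toy illustration of the underlying structure, the sketch below filters sensor readings down to the relevant ones and appends them to a hash-linked chain. It is a single-writer toy, not a distributed ledger; the threshold and asset names are invented, and in a real blockchain it is replication and consensus across parties that make the record trustworthy.

    import hashlib, json, time

    SHOCK_THRESHOLD_G = 5.0   # hypothetical: only shocks this severe are business-relevant

    def make_block(prev_hash, event):
        """Append-only record whose hash covers the previous block's hash."""
        body = {"prev": prev_hash, "ts": time.time(), "event": event}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        return {**body, "hash": digest}

    chain = [make_block("0" * 64, {"type": "genesis", "asset": "container-4711"})]
    for reading in [{"shock_g": 1.2}, {"shock_g": 7.8}, {"shock_g": 0.4}]:
        if reading["shock_g"] >= SHOCK_THRESHOLD_G:      # keep the ledger small:
            chain.append(make_block(chain[-1]["hash"],   # record only relevant signals
                                    {"type": "shock", **reading}))

    print(len(chain), "blocks; last block links back to", chain[-1]["prev"][:12], "...")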

Ideas to ignite thinking in specific industries

  • The digital, “blockchained” physical asset (asset lifecycle management): No matter whether you build, use, or maintain an asset, such as a machine, a piece of equipment, a turbine, or a whole aircraft, a blockchain transaction (genesis block) can be created when the asset is created. The blockchain will contain all the contracts and information for the asset as a whole and its parts. In this scenario, an entry is made in the blockchain every time the asset is sold; is maintained by the producer’s or owner’s maintenance team; is audited by a third-party auditor; has malfunctioning parts; sends or receives information from sensors; meets specific thresholds; has spare parts built in; requires a change to its purpose or capability due to age or usage duration; receives (or doesn’t receive) payments; and so on.
  • The delivery chain, bill of lading: In today’s world, shipping freight from A to B involves many manual steps. For example, a carrier receives a booking from a shipper or forwarder, confirms it, and, before the document cut-off time, receives the shipping instructions describing the content and how the master bill of lading should be created. The carrier creates the original bill of lading and hands it over to the ordering party (the current owner of the cargo). Today, that original paper-based bill of lading is required for the freight (the container) to be picked up at the destination (the port of discharge). Imagine if we could do this as a blockchain transaction instead of handing over paper or forwarding a PDF by email. There would be one transaction at the beginning, when the shipping carrier creates the bill of lading. Then there would be look-ups, e.g., by the shipper’s import and release processing clerk at the port of discharge and by the new owner of the cargo at the destination. Then another transaction could document that the container had been handed over. A minimal sketch of such a hash-linked chain of custody follows this list.
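
The sketch below shows what such a chain of custody could look like. The party names (e.g., “ACME Lines”) are made up, and a real implementation would live on a shared network rather than in one process; the point is only that each custody event links to the hash of its predecessor, so later tampering is detectable, while look-ups are simple reads.

    import hashlib
    import json

    def entry(prev_hash, event, detail):
        # Each custody record carries the hash of its predecessor.
        record = {"prev": prev_hash, "event": event, "detail": detail}
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        return record

    # One transaction at the beginning: the carrier issues the bill of lading.
    ledger = [entry("0" * 64, "bill_of_lading_created", {"carrier": "ACME Lines"})]
    # Later custody changes are further transactions; look-ups only read.
    ledger.append(entry(ledger[-1]["hash"], "handover", {"to": "port of discharge"}))
    ledger.append(entry(ledger[-1]["hash"], "released", {"to": "new cargo owner"}))

    def verify(ledger):
        # Tampering with any earlier record breaks every later hash link.
        return all(cur["prev"] == prev["hash"]
                   for prev, cur in zip(ledger, ledger[1:]))

    print(verify(ledger))  # True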

The future

I personally believe in the massive transformative power of blockchain, even though we are just at the very beginning. This transformation will be achieved by looking at larger networks with many participants that all have a nearly equal part in a process. Today, many blockchain ideas still take a rather centralized approach, in which one company has a more prominent role than the (many) others and often “manages” the blockchain- or distributed-ledger-supported process.

But think about the delivery scenario today, where goods are shipped from one door or company to another, across many parties in the delivery chain: from the shipper/producer via the third-party logistics service provider and/or freight forwarder; to the companies doing the actual transport by vessel, truck, aircraft, train, car, or ferry; to the final destination/receiver. And all of this happens across many countries, many borders, many handovers, and customs, and involves a lot of paperwork for all constituents.

“Blockchaining” this will be truly transformational. But it will need all constituents in the process or network to participate, even if they have different interests, and to agree on basic principles and an approach.

As Torsten Zube put it, I am neither a “blockchain extremist” nor a denier who believes this is just hype, but a realist open to embracing a new technology in order to change our processes for our collective benefit.


Digitalist Magazine

Pentaho Business Analytics Blog

Today, our parent company Hitachi, a global leader across industries, infrastructure, and technology, announced the formation of Hitachi Vantara, a company whose aim is to help organizations thrive in today’s uncertain and turbulent times and prepare for the future. This new company unifies the mission and operations of Pentaho,…


Pentaho Business Analytics Blog

How Executives Can Leverage Data Analytics To Enhance Performance

The business world is now firmly in the age of data. Not that data wasn’t relevant before; it was just nowhere close to the speed and volume that’s available to us today. Businesses are buckling under the deluge of petabytes, exabytes, and zettabytes. Within these bytes lies valuable information on customer behavior, key business insights, and revenue generation. However, all that data is practically useless to a business that cannot identify the right data. And if businesses lack the talent and resources to capture the right data, organize it, dissect it, draw actionable insights from it and, finally, deliver those insights in a meaningful way, their data initiatives will fail.

Rise of the CDO

Companies of all sizes can easily find themselves drowning in data generated from websites, landing pages, social streams, emails, text messages, and many other sources, in addition to the data already in their own repositories. With so much data at their disposal, companies are under mounting pressure to utilize it to generate insights. These insights are critical because they can (and should) drive the overall business strategy and help companies make better business decisions. To leverage the power of data analytics, businesses need more “top-management muscle” specialized in the field of data science. This specialized field has led to the creation of roles like Chief Data Officer (CDO).

In addition, with more companies undertaking digital transformations, there’s greater impetus for the C-suite to make data-driven decisions. The CDO helps make data-driven decisions and also develops a digital business strategy around those decisions. As data grows at an unstoppable rate and becomes an inseparable part of key business functions, we will see the CDO act as a bridge among the other C-suite execs.

Data skills an emerging business necessity

So far, only large enterprises with substantial data mining and management needs maintain in-house solutions: dedicated teams and technologies that handle their growing sets of diverse and dispersed data. Everyone else works with third-party service providers to develop and execute their big data strategies.

As the amount of data grows, the need to mine it for insights becomes a key business requirement. For both large and small businesses, data-centric roles such as data analysts and data scientists will see sustained upward mobility. There is going to be a huge opportunity for critical thinkers to turn their analytical skills into rapidly growing roles in the field of data science. In fact, data skills are now a prized qualification for titles like IT project manager and computer systems analyst.

Forbes cited the McKinsey Global Institute’s prediction that by 2018 there could be a massive shortage of data-skilled professionals. This points to a demand-supply imbalance, with demand for data skills at an all-time high. With an increasing number of companies adopting big data strategies, salaries for data jobs are going through the roof, turning these positions into highly coveted ones.

According to Harvard Professor Gary King, “There is a big data revolution. The big data revolution is that now we can do something with the data.” The big problem is that most enterprises don’t know what to do with data. Data professionals are helping businesses figure that out. So if you’re casting about for where to apply your skills and want to take advantage of one of the best career paths in the job market today, focus on data science.

I’m compensated by University of Phoenix for this blog. As always, all thoughts and opinions are my own.

For more insight on our increasingly connected future, see The $19 Trillion Question: Are You Undervaluing The Internet Of Things?

The post Data Analysts and Scientists More Important Than Ever For the Enterprise appeared first on Millennial CEO.


Digitalist Magazine

FogHorn raises $30 million to provide IoT edge computing analytics

FogHorn, which provides data analytics software for industrial and commercial Internet of Things (IoT) applications, announced today that it has secured $30 million in funding. Intel Capital and Saudi Aramco Energy Ventures co-led this round, with new investor Honeywell Ventures joining in. All previous investors also participated, including March Capital, GE, Dell, Bosch, Yokogawa, Darling Ventures, and The Hive.

“In the industrial application of IoT, such as manufacturing facilities, oil and gas production sites, and transportation systems, there are often hundreds or thousands of physical and video/audio sensors continuously producing a massive amount of high velocity data,” wrote FogHorn CEO David King, in an email to VentureBeat. “This data is being collected at the network ‘edge’, what Cisco coined the ‘fog’ layer several years ago.”

Edge computing is a method of optimizing cloud computing systems by performing data processing at the edge of the network, near the source of the data. According to King, industrial operators face several challenges when collecting and processing data, including a high volume of data-collecting sources, the high cost of transporting that data to the cloud, and limits on real-time insights.

While latency may be fine when you are conversing with Amazon’s Alexa, having a delayed response to a gas leak could be extremely dangerous.

FogHorn’s Lightning platform has been purpose-built to run in very small footprint (256MB or smaller) edge computing systems. “The reason this is important is that the vast majority of data streaming from IoT sensors is useless within a very short period of time,” wrote King. “The information that is valuable — the anomalies and hard-to-detect patterns — need to be acted upon while operators can take corrective action.”
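
What King describes amounts to streaming anomaly detection at the data source. The sketch below is not FogHorn’s actual Lightning software, just a minimal stand-in under those assumptions: a rolling z-score detector small enough for a constrained edge device, which flags only the readings worth sending upstream.

    from collections import deque
    import math

    class EdgeAnomalyDetector:
        """Tiny rolling z-score detector; a stand-in for real edge analytics."""

        def __init__(self, window=60, z_limit=3.0):
            self.window = deque(maxlen=window)  # bounded memory footprint
            self.z_limit = z_limit

        def push(self, value):
            w = self.window
            is_anomaly = False
            if len(w) >= 10:  # wait for a minimal baseline first
                mean = sum(w) / len(w)
                std = math.sqrt(sum((x - mean) ** 2 for x in w) / len(w)) or 1e-9
                is_anomaly = abs(value - mean) / std > self.z_limit
            w.append(value)
            return is_anomaly

    # Steady readings around 20 degrees, then one spike (e.g., a gas leak).
    readings = [20.0, 20.1, 19.9] * 40 + [35.0]
    detector = EdgeAnomalyDetector()
    alerts = [v for v in readings if detector.push(v)]
    print(alerts)  # [35.0] -- only the anomaly leaves the edge device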

FogHorn licenses its software on a subscription basis to dozens of customers, according to King. The chief executive does not see any direct competitors focusing solely on tapping into streaming edge data for analytics, machine learning, and AI. “Amazon Greengrass and Microsoft Azure Edge are now targeting the edge with reduced footprint versions of their heavy cloud software stacks, but both still send most data to the cloud for advanced data science functionality,” he added.

The investment from Saudi Aramco Energy Ventures should secure FogHorn’s foothold in Saudi Arabia, which is one of the world’s biggest oil producers.

“Given the heavy presence of oil and gas, we expect it to be a large market in the future,” wrote King. “By partnering with Saudi Aramco Energy Ventures, we’re just beginning our reach into this market.”

To date, FogHorn has raised a total of $47.5 million. The Mountain View, California-based startup will use the fresh injection of capital to hire more engineers and increase sales and marketing efforts.

Founded in 2014 as part of The Hive incubator, FogHorn currently has more than 40 employees.


Big Data – VentureBeat