Monthly Archives: February 2018

Disaster Recovery Plan Barriers (And How to Overcome Them)

Disaster recovery should be as central to your data management strategy as data security and reliability. But at most organizations, it isn’t. Here’s how to overcome the challenges that stand in the way of an effective disaster recovery plan.

According to Syncsort’s “State of Resilience” survey from January 2018, a full 85 percent of IT professionals report that their organizations either have no disaster recovery plan in place, or are not confident in the ability of their disaster recovery plans to meet their goals.


That’s a pretty remarkable figure. Although disaster recovery may not be the most exciting topic for many IT professionals, it’s a crucial one for any business that wants to guarantee its survival following unexpected events.

Addressing Disaster Recovery Plan Hurdles


To build an effective disaster recovery plan — and avoid being counted among the 85 percent of professionals whose organizations lack one — you need to overcome the following disaster recovery hurdles:

  • Thinking backups are enough

Backing up data is part of the disaster recovery process. But it is only a part. A full disaster recovery solution requires data backups and a plan for restoring data quickly following a disaster.

  • Thinking cloud data is safe

If your data is stored in the cloud, you may think it can never disappear. In reality, there are lots of reasons why cloud-based data can be lost, ranging from cyber attacks and accidental data deletion to a failure on the part of your cloud provider. Even if you make extensive use of the cloud, you should still have a disaster-recovery plan in place.

  • Focusing only on compliance

In some cases, organizations may fail to implement effective disaster recovery because there are no compliance pressures for them to do so. Compliance frameworks are less likely to mandate disaster recovery than they are to regulate things like data security. However, compliance requirements shouldn’t be the reason you implement disaster recovery. You should have it in place for the sake of protecting your business and your customers, not because a government regulator tells you to create a disaster recovery plan.

  • Failure to understand what “disaster” means

The term disaster recovery can be a bit misleading. To some people, it implies that disaster recovery is necessary only for organizations that are at risk of suffering a major disaster, like a hurricane or massive fire that wipes out their data center. Those types of events are rare (especially if your data center is located in a region not prone to major natural disasters). However, the reality is that disasters come in many forms, and they may not always be large-scale. The failure of a single server could be a disaster if the server contains critical data that you cannot recover. Disaster recovery is about preparing for all types of disruptions, not just the fire-and-brimstone variety.

  • Lack of training

Your IT staff are well versed in tasks like installing software and storing data. Those are the things they learn in training programs and from on-the-job experience. But chances are that most of them lack prior experience with disaster recovery, and never formally studied the topic as part of training. This means that teaching your team to implement proper disaster recovery will require them to invest time in learning something new.

  • Lack of time and money

Spare time and money tend to be in short supply at most organizations. This shouldn’t be a reason for not investing in disaster recovery planning, however — especially when you consider how much the lack of a disaster recovery solution will cost you in the long run. Downtime costs vary according to the size of your business, of course, but put simply, they are enormous. Downtime also leads to employee productivity loss on the order of 78 percent. That’s a lot of money and staff time that can be saved by having an effective disaster recovery plan in place before a disruption occurs.

To learn more about the state of disaster recovery preparedness in organizations today, read Syncsort’s full “State of Resilience” report.


Syncsort + Trillium Software Blog

How to Boost Your Marketing Analytics Skills


Subscribe.

These two curated newsletters gather up a slew of great articles and resources about applied analytics, especially applied marketing analytics.

The Full Monty: This is a curated newsletter and podcast. There are plenty of analytics-related articles mentioned here and a ton of other stuff of interest to marketers.

Azeem Azhar’s Exponential View: This Sunday newsletter leans heavily on AI and machine learning, but has irresistible, useful reading for marketers and SaaS people.

Follow experts on Twitter.

Prefer to get your analytics insights by the spoonful, or maybe by the tweet? No problem. Follow anyone mentioned in this article, plus at least these three accounts (listed in no particular order):

Jeffalytics (@jeffalytics): Conducting experiments in digital marketing and analytics.

Avinash Kaushik (@avinash): Author of “Web Analytics 2.0” and “Web Analytics: An Hour a Day.”

Feras Alhlou (@ferasa): Co-author of “Google Analytics Breakthrough: From Zero to Business Impact.”

Join marketing analytics groups on LinkedIn.

Analytics and Artificial Intelligence (AI) in Marketing and Retail (63,580 members). How they describe their group: “This group provides a platform for experts to discuss analytics, machine learning (ML), and artificial intelligence (AI) as relevant to marketing, media, and retail.”

Better Marketing with Analytics (15,104 members). “Our mission is to provide a place for people who want to learn about and use analytics for marketing. Meet other marketing people involved with analytics to have discussions and make connections. Whether you are someone who wants to use Analytics to improve your marketing efforts or someone who is already an expert of how to create and use it, this is the place for you.”

Big Data, Analytics, Business Intelligence & Visualization Experts Community (210,797 members). “A premier community for both existing expert professionals and companies researching the convergence of big data analytics and discovery, Hadoop, data warehousing, cloud, unified data architectures, digital marketing, visualization, and business intelligence.”

Web Analytics Professionals (21,607 members). “Biggest analytics group in LinkedIn. Designed for analytics professionals. Lots of jobs, insights, benchmarks, industry leaders, tool providers, and many other user types. Your way to get in to the analytics industry.”

Get certified.

Google Analytics Qualification (for individuals). To become Google Analytics Qualified, you’ll have to pass the Google Analytics Individual Qualification (IQ) exam. To get access to the exam, you’ll need to register with Google Partners. Joining Partners is free, and once you’re registered you’ll have access to a library of resources, including the Google Analytics for Beginners and Advanced Google Analytics tracks. Finish those training programs, take the 90-minute exam, get at least 80% on it, and you’re in!

The Direct Marketing Association’s marketing analytics certificate. One of my old NYU master’s degree professors, the super-smart Perry Drake, teaches this DMA course. The class is just 8-9 hours long. Costs $479 for DMA members and $799 for non-members.

General Assembly’s Data Analysis online courses. I came within an eyelash of taking this course a few years ago. It’s still on my wish list. 11-week course. $1,250.

University of Washington’s Professional and Continuing Education Certificate in Digital Marketing Analytics. 8-month curriculum; $3,297.

Penn State’s Graduate Certificate in Marketing Analytics. This is a hefty program, with hefty requirements (4 full courses) and a hefty price tag ($11,160). But you can spread the coursework out, and if you really want to master data like a pro, seriously consider this program.

Listen.

Our own Nathan Isaacs has two recent podcasts about marketing analytics.

Kaushik’s site also has a page of the podcasts/webinars he’s appeared on recently. And don’t forget Audible for all the analytics books you don’t have time to read.

Watch.

TED Talks are a fantastic resource. And while the number of TED Talks expressly about marketing analytics is pretty small, there are a slew of talks about data.


TED Talks are crazy good at one particular thing: Getting you to think about how analytics can be applied. This will add depth to your understanding of the more mundane parts of marketing analytics. It also helps you see your analytics work with fresh eyes.

Visualize.

How you present your data is almost as important as what you learn from it. Hours of work can fall flat if you show a C-level executive a complex data visualization that doesn’t make sense to them. Standing there saying “but this, but that” won’t help if they don’t understand the chart you showed them.

So get smart. Clean up your charts by knowing how to present data in more elegant ways. Make your reports actionable and shareable.


Act-On Blog

Three Ways Artificial Intelligence Can Boost Your Marketing

In 2013, several UK supermarket chains discovered that products they were selling as beef were actually made at least partly—and in some cases, entirely—from horsemeat. The resulting uproar led to a series of product recalls, prompted stricter food testing, and spurred the European food industry to take a closer look at how unlabeled or mislabeled ingredients were finding their way into the food chain.

By 2020, a scandal like this will be eminently preventable.

The separation between bovine and equine will become immutable with Internet of Things (IoT) sensors, which will track the provenance and identity of every animal from stall to store, adding the data to a blockchain that anyone can check but no one can alter.

Food processing companies will be able to use that blockchain to confirm and label the contents of their products accordingly—down to the specific farms and animals represented in every individual package. That level of detail may be too much information for shoppers, but they will at least be able to trust that their meatballs come from the appropriate species.

The Spine of Digitalization


Keeping food safer and more traceable is just the beginning, however. Improvements in the supply chain, which have been incremental for decades despite billions of dollars of technology investments, are about to go exponential. Emerging technologies are converging to transform the supply chain from tactical to strategic, from an easily replicable commodity to a new source of competitive differentiation.

You may already be thinking about how to take advantage of blockchain technology, which makes data and transactions immutable, transparent, and verifiable (see “What Is Blockchain and How Does It Work?”). That will be a powerful tool to boost supply chain speed and efficiency—always a worthy goal, but hardly a disruptive one.

However, if you think of blockchain as the spine of digitalization and technologies such as AI, the IoT, 3D printing, autonomous vehicles, and drones as the limbs, you have a powerful supply chain body that can leapfrog ahead of its competition.

What Is Blockchain and How Does It Work?

Here’s why blockchain technology is critical to transforming the supply chain.

Blockchain is essentially a sequential, distributed ledger of transactions that is constantly updated on a global network of computers. The ownership and history of a transaction is embedded in the blockchain at the transaction’s earliest stages and verified at every subsequent stage.

A blockchain network uses vast amounts of computing power to encrypt the ledger as it’s being written. This makes it possible for every computer in the network to verify the transactions safely and transparently. The more organizations that participate in the ledger, the more complex and secure the encryption becomes, making it increasingly tamperproof.
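The hash-linking described above can be sketched in a few lines of Python. This is a toy illustration only, not any production blockchain protocol; the field names (such as `animal_id`, echoing the food-traceability example) are hypothetical:

```python
import hashlib
import json

def hash_block(contents):
    """Deterministically hash a block's contents with SHA-256."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transaction):
    """Append a block whose hash covers the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "transaction": transaction,
             "prev_hash": prev_hash}
    block["hash"] = hash_block({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)
    return block

def verify(chain):
    """Tampering with any earlier block breaks every later hash link."""
    for i, block in enumerate(chain):
        expected = hash_block({k: v for k, v in block.items() if k != "hash"})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
add_block(ledger, {"animal_id": "UK-COW-0042", "event": "shipped"})
add_block(ledger, {"animal_id": "UK-COW-0042", "event": "received"})
assert verify(ledger)
ledger[0]["transaction"]["event"] = "substituted"  # rewrite history
assert not verify(ledger)                          # the chain detects it
```

A real network adds distributed consensus and proof-of-work on top of this linking, which is where the computing power described above comes in.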

Why does blockchain matter for the supply chain?

  • It enables the safe exchange of value without a central verifying partner, which makes transactions faster and less expensive.
  • It dramatically simplifies recordkeeping by establishing a single, authoritative view of the truth across all parties.
  • It builds a secure, immutable history and chain of custody as different parties handle the items being shipped, and it updates the relevant documentation.
  • By doing these things, blockchain allows companies to create smart contracts based on programmable business logic, which can execute themselves autonomously and thereby save time and money by reducing friction and intermediaries.

Hints of the Future

In the mid-1990s, when the World Wide Web was in its infancy, we had no idea that the internet would become so large and pervasive, nor that we’d find a way to carry it all in our pockets on small slabs of glass.

But we could tell that it had vast potential.

Today, with the combination of emerging technologies that promise to turbocharge digital transformation, we’re just beginning to see how we might turn the supply chain into a source of competitive advantage (see “What’s the Magic Combination?”).

What’s the Magic Combination?

Those who focus on blockchain in isolation will miss out on a much bigger supply chain opportunity.

Many experts believe emerging technologies will work with blockchain to digitalize the supply chain and create new business models:

  • Blockchain will provide the foundation of automated trust for all parties in the supply chain.
  • The IoT will link objects—from tiny devices to large machines—and generate data about status, locations, and transactions that will be recorded on the blockchain.
  • 3D printing will extend the supply chain to the customer’s doorstep with hyperlocal manufacturing of parts and products with IoT sensors built into the items and/or their packaging. Every manufactured object will be smart, connected, and able to communicate so that it can be tracked and traced as needed.
  • Big Data management tools will process all the information streaming in around the clock from IoT sensors.
  • AI and machine learning will analyze this enormous amount of data to reveal patterns and enable true predictability in every area of the supply chain.

Combining these technologies with powerful analytics tools to predict trends will make lack of visibility into the supply chain a thing of the past. Organizations will be able to examine a single machine across its entire lifecycle and identify areas where they can improve performance and increase return on investment. They’ll be able to follow and monitor every component of a product, from design through delivery and service. They’ll be able to trigger and track automated actions between and among partners and customers to provide customized transactions in real time based on real data.

After decades of talk about markets of one, companies will finally have the power to create them—at scale and profitably.

Amazon, for example, is becoming as much a logistics company as a retailer. Its ordering and delivery systems are so streamlined that its customers can launch and complete a same-day transaction with a push of a single IP-enabled button or a word to its ever-attentive AI device, Alexa. And this level of experimentation and innovation is bubbling up across industries.

Consider manufacturing, where the IoT is transforming automation inside already highly automated factories. Machine-to-machine communication is enabling robots to set up, provision, and unload equipment quickly and accurately with minimal human intervention. Meanwhile, sensors across the factory floor are already capable of gathering such information as how often each machine needs maintenance or how much raw material to order given current production trends.

Once they harvest enough data, businesses will be able to feed it through machine learning algorithms to identify trends that forecast future outcomes. At that point, the supply chain will start to become both automated and predictive. We’ll begin to see business models that include proactively scheduling maintenance, replacing parts just before they’re likely to break, and automatically ordering materials and initiating customer shipments.
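As a rough illustration of the kind of rule such a trained model might encode, here is a toy drift check that flags a machine for service before failure. The sensor name, readings, and tolerance are hypothetical, not drawn from any real system:

```python
def needs_maintenance(readings, baseline, tolerance=0.15, window=5):
    """Flag a machine when the mean of its last `window` sensor readings
    drifts more than `tolerance` (fractional) from its healthy baseline."""
    recent = readings[-window:]
    mean = sum(recent) / len(recent)
    return abs(mean - baseline) / baseline > tolerance

# Hypothetical vibration readings (mm/s) trending upward over time:
vibration_mm_s = [2.0, 2.1, 2.0, 2.2, 2.1, 2.6, 2.8, 3.0, 3.1, 3.2]
print(needs_maintenance(vibration_mm_s, baseline=2.0))  # True: schedule service
```

In practice the threshold would be learned from historical failure data rather than hand-set, but the business outcome is the same: a work order is raised before the part breaks.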

Italian train operator Trenitalia, for example, has put IoT sensors on its locomotives and passenger cars and is using analytics and in-memory computing to gauge the health of its trains in real time, according to an article in Computer Weekly. “It is now possible to affordably collect huge amounts of data from hundreds of sensors in a single train, analyse that data in real time and detect problems before they actually happen,” Trenitalia’s CIO Danilo Gismondi told Computer Weekly.

The project, which is scheduled to be completed in 2018, will change Trenitalia’s business model, allowing it to schedule more trips and make each one more profitable. The railway company will be able to better plan parts inventories and determine which lines are consistently performing poorly and need upgrades. The new system will save €100 million a year, according to ARC Advisory Group.

New business models continue to evolve as 3D printers become more sophisticated and affordable, making it possible to move the end of the supply chain closer to the customer. Companies can design parts and products in materials ranging from carbon fiber to chocolate and then print those items in their warehouse, at a conveniently located third-party vendor, or even on the client’s premises.

In addition to minimizing their shipping expenses and reducing fulfillment time, companies will be able to offer more personalized or customized items affordably in small quantities. For example, clothing retailer Ministry of Supply recently installed a 3D printer at its Boston store that enables it to make an article of clothing to a customer’s specifications in under 90 minutes, according to an article in Forbes.

This kind of highly distributed manufacturing has potential across many industries. It could even create a market for secure manufacturing for highly regulated sectors, allowing a manufacturer to transmit encrypted templates to printers in tightly protected locations, for example.

Meanwhile, organizations are investigating ways of using blockchain technology to authenticate, track and trace, automate, and otherwise manage transactions and interactions, both internally and within their vendor and customer networks. The ability to collect data, record it on the blockchain for immediate verification, and make that trustworthy data available for any application delivers indisputable value in any business context. The supply chain will be no exception.


Blockchain Is the Change Driver

The supply chain is configured as we know it today because it’s impossible to create a contract that accounts for every possible contingency. Consider cross-border financial transfers, which are so complex and must meet so many regulations that they require a tremendous number of intermediaries to plug the gaps: lawyers, accountants, customer service reps, warehouse operators, bankers, and more. By reducing that complexity, blockchain technology makes intermediaries less necessary—a transformation that is revolutionary even when measured only in cost savings.

“If you’re selling 100 items a minute, 24 hours a day, reducing the cost of the supply chain by just $1 per item saves you more than $52.5 million a year,” notes Dirk Lonser, SAP go-to-market leader at DXC Technology, an IT services company. “By replacing manual processes and multiple peer-to-peer connections through fax or e-mail with a single medium where everyone can exchange verified information instantaneously, blockchain will boost profit margins exponentially without raising prices or even increasing individual productivity.”
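Lonser’s figure checks out; a quick back-of-the-envelope calculation:

```python
items_per_minute = 100
savings_per_item = 1                      # dollars
minutes_per_year = 60 * 24 * 365          # 525,600 minutes
annual_savings = items_per_minute * savings_per_item * minutes_per_year
print(annual_savings)                     # 52560000, i.e. "more than $52.5 million"
```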

But the potential for blockchain extends far beyond cost cutting and streamlining, says Irfan Khan, CEO of supply chain management consulting and systems integration firm Bristlecone, a Mahindra Group company. It will give companies ways to differentiate.

“Blockchain will let enterprises more accurately trace faulty parts or products from end users back to factories for recalls,” Khan says. “It will streamline supplier onboarding, contracting, and management by creating an integrated platform that the company’s entire network can access in real time. It will give vendors secure, transparent visibility into inventory 24×7. And at a time when counterfeiting is a real concern in multiple industries, it will make it easy for both retailers and customers to check product authenticity.”

Blockchain allows all the critical steps of the supply chain to go electronic and become irrefutably verifiable by all the critical parties within minutes: the seller and buyer, banks, logistics carriers, and import and export officials. Although the key parts of the process remain the same as in today’s analog supply chain, performing them electronically with blockchain technology shortens each stage from hours or days to seconds while eliminating reams of wasteful paperwork. With goods moving that quickly, companies have ample room for designing new business models around manufacturing, service, and delivery.


Challenges on the Path to Adoption

For all this to work, however, the data on the blockchain must be correct from the beginning. The pills, produce, or parts on the delivery truck need to be the same as the items listed on the manifest at the loading dock. Every use case assumes that the data is accurate—and that will only happen when everything that’s manufactured is smart, connected, and able to self-verify automatically with the help of machine learning tuned to detect errors and potential fraud.

Companies are already seeing the possibilities of applying this bundle of emerging technologies to the supply chain. IDC projects that by 2021, at least 25% of Forbes Global 2000 (G2000) companies will use blockchain services as a foundation for digital trust at scale; 30% of top global manufacturers and retailers will do so by 2020. IDC also predicts that by 2020, up to 10% of pilot and production blockchain-distributed ledgers will incorporate data from IoT sensors.

Despite IDC’s optimism, though, the biggest barrier to adoption is the early stage of enterprise use cases, particularly around blockchain. Currently, the sole significant blockchain production system is the virtual currency Bitcoin, which has unfortunately been tainted by its associations with speculation, dubious financial transactions, and the so-called dark web.

The technology is still in a sufficiently early stage that there’s significant uncertainty about its ability to handle the massive amounts of data a global enterprise supply chain generates daily. Never mind that it’s completely unregulated, with no global standard. There’s also a critical global shortage of experts who can explain emerging technologies like blockchain, the IoT, and machine learning to nontechnology industries and educate organizations in how the technologies can improve their supply chain processes. Finally, there is concern about how blockchain’s complex algorithms gobble computing power—and electricity (see “Blockchain Blackouts”).

Blockchain Blackouts

Blockchain is a power glutton. Can technology mitigate the issue?

A major concern today is the enormous carbon footprint of the networks creating and solving the algorithmic problems that keep blockchains secure. Although virtual currency enthusiasts claim the problem is overstated, Michael Reed, head of blockchain technology for Intel, has been widely quoted as saying that the energy demands of blockchains are a significant drain on the world’s electricity resources.

Indeed, Wired magazine has estimated that by July 2019, the Bitcoin network alone will require more energy than the entire United States currently uses and that by February 2020 it will use as much electricity as the entire world does today.

Still, computing power is becoming more energy efficient by the day and sticking with paperwork will become too slow, so experts—Intel’s Reed among them—consider this a solvable problem.

“We don’t know yet what the market will adopt. In a decade, it might be status quo or best practice, or it could be the next Betamax, a great technology for which there was no demand,” Lonser says. “Even highly regulated industries that need greater transparency in the entire supply chain are moving fairly slowly.”

Blockchain will require acceptance by a critical mass of companies, governments, and other organizations before it displaces paper documentation. It’s a chicken-and-egg issue: multiple companies need to adopt these technologies at the same time so they can build a blockchain to exchange information, yet getting multiple companies to do anything simultaneously is a challenge. Some early initiatives are already underway, though:

  • A London-based startup called Everledger is using blockchain and IoT technology to track the provenance, ownership, and lifecycles of valuable assets. The company began by tracking diamonds from mine to jewelry using roughly 200 different characteristics, with a goal of stopping both the demand for and the supply of “conflict diamonds”—diamonds mined in war zones and sold to finance insurgencies. It has since expanded to cover wine, artwork, and other high-value items to prevent fraud and verify authenticity.
  • In September 2017, SAP announced the creation of its SAP Leonardo Blockchain Co-Innovation program, a group of 27 enterprise customers interested in co-innovating around blockchain and creating business buy-in. The diverse group of participants includes management and technology services companies Capgemini and Deloitte, cosmetics company Natura Cosméticos S.A., and Moog Inc., a manufacturer of precision motion control systems.
  • Two of Europe’s largest shipping ports—Rotterdam and Antwerp—are working on blockchain projects to streamline interaction with port customers. The Antwerp terminal authority says eliminating paperwork could cut the costs of container transport by as much as 50%.
  • The Chinese online shopping behemoth Alibaba is experimenting with blockchain to verify the authenticity of food products and catch counterfeits before they endanger people’s health and lives.
  • Technology and transportation executives have teamed up to create the Blockchain in Transport Alliance (BiTA), a forum for developing blockchain standards and education for the freight industry.

It’s likely that the first blockchain-based enterprise supply chain use case will emerge in the next year among companies that see it as an opportunity to bolster their legal compliance and improve business processes. Once that happens, expect others to follow.


Customers Will Expect Change

It’s only a matter of time before the supply chain becomes a competitive driver. The question for today’s enterprises is how to prepare for the shift. Customers are going to expect constant, granular visibility into their transactions and faster, more customized service every step of the way. Organizations will need to be ready to meet those expectations.

If organizations have manual business processes that could never be automated before, now is the time to see if it’s possible. Organizations that have made initial investments in emerging technologies are looking at how their pilot projects are paying off and where they might extend to the supply chain. They are starting to think creatively about how to combine technologies to offer a product, service, or business model not possible before.

A manufacturer will load a self-driving truck with a 3D printer capable of creating a customer’s ordered item en route to delivering it. A vendor will capture the market for a socially responsible product by allowing its customers to track the product’s production and verify that none of its subcontractors use slave labor. And a supermarket chain will win over customers by persuading them that their choice of supermarket is also a choice between being certain of what’s in their food and simply hoping that what’s on the label matches what’s inside.

At that point, a smart supply chain won’t just be a competitive edge. It will become a competitive necessity.


About the Authors

Gil Perez is Senior Vice President, Internet of Things and Digital Supply Chain, at SAP.

Tom Raftery is Global Vice President, Futurist, and Internet of Things Evangelist, at SAP.

Hans Thalbauer is Senior Vice President, Internet of Things and Digital Supply Chain, at SAP.

Dan Wellers is Global Lead, Digital Futures, at SAP.

Fawn Fitter is a freelance writer specializing in business and technology.

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



Digitalist Magazine

HBO’s ‘Fahrenheit 451’ New Film Starring Michael B. Jordan Releases New Trailer


HBO released its first look at “Fahrenheit 451,” starring Michael B. Jordan and Michael Shannon.

Based on Ray Bradbury’s 1953 dystopian novel of the same name, the film follows a futuristic American society where books are burned to censor and destroy knowledge.

“By the time you guys grow up, there won’t be one book left,” Jordan’s character Guy Montag, a “fireman” in charge of burning the novels who struggles with his role of erasing history, says in the clip before an auditorium of children.

“A little knowledge is a dangerous thing — news, facts, memoirs, internet of old. Burn it,” Shannon, who plays Jordan’s captain, says in the trailer. “We are not born equal, so we must be made equal by the fire.”

Directed by Ramin Bahrani, the film also stars Sofia Boutella, Lilly Singh, Laura Harrier, and Martin Donovan. Jordan served as an executive producer on the project, producing through his Outlier Productions. Other executive producers are Sarah Green of Brace Cove Productions, Alan Gasmer, Peter Jaysen, and Noruz Films’ Bahrani, who co-wrote with Amir Naderi. David Coatsworth is a producer.

This is not the first time Bradbury’s story has been adapted, as it was previously made into a 1966 British film, a 1979 play, a 1984 video game, and a collection of short stories by the author, titled “A Pleasure to Burn.”

“Fahrenheit 451” will be released on HBO in May.



The Humor Mill

What CFOs Need To Know As The Workforce Adjusts To The Digital Age: Part 1

In 2013, several UK supermarket chains discovered that products they were selling as beef were actually made at least partly—and in some cases, entirely—from horsemeat. The resulting uproar led to a series of product recalls, prompted stricter food testing, and spurred the European food industry to take a closer look at how unlabeled or mislabeled ingredients were finding their way into the food chain.

By 2020, a scandal like this will be eminently preventable.

The separation between bovine and equine will become immutable with Internet of Things (IoT) sensors, which will track the provenance and identity of every animal from stall to store, adding the data to a blockchain that anyone can check but no one can alter.

Food processing companies will be able to use that blockchain to confirm and label the contents of their products accordingly—down to the specific farms and animals represented in every individual package. That level of detail may be too much information for shoppers, but they will at least be able to trust that their meatballs come from the appropriate species.

The Spine of Digitalization


Keeping food safer and more traceable is just the beginning, however. Improvements in the supply chain, which have been incremental for decades despite billions of dollars of technology investments, are about to go exponential. Emerging technologies are converging to transform the supply chain from tactical to strategic, from an easily replicable commodity to a new source of competitive differentiation.

You may already be thinking about how to take advantage of blockchain technology, which makes data and transactions immutable, transparent, and verifiable (see “What Is Blockchain and How Does It Work?”). That will be a powerful tool to boost supply chain speed and efficiency—always a worthy goal, but hardly a disruptive one.

However, if you think of blockchain as the spine of digitalization and technologies such as AI, the IoT, 3D printing, autonomous vehicles, and drones as the limbs, you have a powerful supply chain body that can leapfrog ahead of its competition.

What Is Blockchain and How Does It Work?

Here’s why blockchain technology is critical to transforming the supply chain.

Blockchain is essentially a sequential, distributed ledger of transactions that is constantly updated on a global network of computers. The ownership and history of a transaction is embedded in the blockchain at the transaction’s earliest stages and verified at every subsequent stage.

A blockchain network uses vast amounts of computing power to encrypt the ledger as it’s being written. This makes it possible for every computer in the network to verify the transactions safely and transparently. The more organizations that participate in the ledger, the more complex and secure the encryption becomes, making it increasingly tamperproof.

Why does blockchain matter for the supply chain?

  • It enables the safe exchange of value without a central verifying partner, which makes transactions faster and less expensive.
  • It dramatically simplifies recordkeeping by establishing a single, authoritative view of the truth across all parties.
  • It builds a secure, immutable history and chain of custody as different parties handle the items being shipped, and it updates the relevant documentation.
  • By doing these things, blockchain allows companies to create smart contracts based on programmable business logic, which can execute themselves autonomously and thereby save time and money by reducing friction and intermediaries.
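
The mechanics behind these properties fit in a few lines. The sketch below is a toy, in-memory hash chain (not any production blockchain protocol, and all names are illustrative) showing why an immutable history falls out of linking each block to the hash of its predecessor:

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transaction):
    """Append a transaction, linking it to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "tx": transaction})
    return chain

def verify(chain):
    """Recompute every link; tampering with any earlier block breaks the chain."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_block(chain, {"item": "beef batch 42", "from": "farm A", "to": "processor B"})
append_block(chain, {"item": "beef batch 42", "from": "processor B", "to": "retailer C"})
assert verify(chain)

chain[0]["tx"]["item"] = "horsemeat batch 42"  # rewrite history...
assert not verify(chain)                        # ...and every later link fails
```

Real networks add distribution and consensus on top of this linking, which is what makes the ledger tamper-evident across many parties rather than just within one process.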

Hints of the Future

In the mid-1990s, when the World Wide Web was in its infancy, we had no idea that the internet would become so large and pervasive, nor that we’d find a way to carry it all in our pockets on small slabs of glass.

But we could tell that it had vast potential.

Today, with the combination of emerging technologies that promise to turbocharge digital transformation, we’re just beginning to see how we might turn the supply chain into a source of competitive advantage (see “What’s the Magic Combination?”).

What’s the Magic Combination?

Those who focus on blockchain in isolation will miss out on a much bigger supply chain opportunity.

Many experts believe emerging technologies will work with blockchain to digitalize the supply chain and create new business models:

  • Blockchain will provide the foundation of automated trust for all parties in the supply chain.
  • The IoT will link objects—from tiny devices to large machines—and generate data about status, locations, and transactions that will be recorded on the blockchain.
  • 3D printing will extend the supply chain to the customer’s doorstep with hyperlocal manufacturing of parts and products with IoT sensors built into the items and/or their packaging. Every manufactured object will be smart, connected, and able to communicate so that it can be tracked and traced as needed.
  • Big Data management tools will process all the information streaming in around the clock from IoT sensors.
  • AI and machine learning will analyze this enormous amount of data to reveal patterns and enable true predictability in every area of the supply chain.

Combining these technologies with powerful analytics tools to predict trends will make lack of visibility into the supply chain a thing of the past. Organizations will be able to examine a single machine across its entire lifecycle and identify areas where they can improve performance and increase return on investment. They’ll be able to follow and monitor every component of a product, from design through delivery and service. They’ll be able to trigger and track automated actions between and among partners and customers to provide customized transactions in real time based on real data.

After decades of talk about markets of one, companies will finally have the power to create them—at scale and profitably.

Amazon, for example, is becoming as much a logistics company as a retailer. Its ordering and delivery systems are so streamlined that its customers can launch and complete a same-day transaction with a push of a single IP-enabled button or a word to its ever-attentive AI device, Alexa. And this level of experimentation and innovation is bubbling up across industries.

Consider manufacturing, where the IoT is transforming automation inside already highly automated factories. Machine-to-machine communication is enabling robots to set up, provision, and unload equipment quickly and accurately with minimal human intervention. Meanwhile, sensors across the factory floor are already capable of gathering such information as how often each machine needs maintenance or how much raw material to order given current production trends.

Once they harvest enough data, businesses will be able to feed it through machine learning algorithms to identify trends that forecast future outcomes. At that point, the supply chain will start to become both automated and predictive. We’ll begin to see business models that include proactively scheduling maintenance, replacing parts just before they’re likely to break, and automatically ordering materials and initiating customer shipments.
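
As a deliberately simplified stand-in for those machine learning models, the sketch below flags a machine for maintenance when a rolling average of its sensor readings drifts past a threshold. The function name, window, and threshold are illustrative assumptions, not a real factory system:

```python
def needs_maintenance(readings, window=3, threshold=0.8):
    """Flag a machine when the rolling mean of its recent sensor
    readings rises above a threshold (a crude proxy for a learned
    predictive-maintenance model)."""
    if len(readings) < window:
        return False
    recent = readings[-window:]
    return sum(recent) / window > threshold

# Hypothetical vibration readings trending upward over time
vibration = [0.2, 0.25, 0.3, 0.7, 0.85, 0.9]
assert needs_maintenance(vibration)          # recent mean is above 0.8
assert not needs_maintenance(vibration[:3])  # early readings are healthy
```

A production system would replace the fixed threshold with a model trained on historical failure data, but the business outcome is the same: maintenance is scheduled before the breakdown, not after.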

Italian train operator Trenitalia, for example, has put IoT sensors on its locomotives and passenger cars and is using analytics and in-memory computing to gauge the health of its trains in real time, according to an article in Computer Weekly. “It is now possible to affordably collect huge amounts of data from hundreds of sensors in a single train, analyse that data in real time and detect problems before they actually happen,” Trenitalia’s CIO Danilo Gismondi told Computer Weekly.

The project, which is scheduled to be completed in 2018, will change Trenitalia’s business model, allowing it to schedule more trips and make each one more profitable. The railway company will be able to better plan parts inventories and determine which lines are consistently performing poorly and need upgrades. The new system will save €100 million a year, according to ARC Advisory Group.

New business models continue to evolve as 3D printers become more sophisticated and affordable, making it possible to move the end of the supply chain closer to the customer. Companies can design parts and products in materials ranging from carbon fiber to chocolate and then print those items in their warehouse, at a conveniently located third-party vendor, or even on the client’s premises.

In addition to minimizing their shipping expenses and reducing fulfillment time, companies will be able to offer more personalized or customized items affordably in small quantities. For example, clothing retailer Ministry of Supply recently installed a 3D printer at its Boston store that enables it to make an article of clothing to a customer’s specifications in under 90 minutes, according to an article in Forbes.

This kind of highly distributed manufacturing has potential across many industries. It could even create a market for secure manufacturing for highly regulated sectors, allowing a manufacturer to transmit encrypted templates to printers in tightly protected locations, for example.

Meanwhile, organizations are investigating ways of using blockchain technology to authenticate, track and trace, automate, and otherwise manage transactions and interactions, both internally and within their vendor and customer networks. The ability to collect data, record it on the blockchain for immediate verification, and make that trustworthy data available for any application delivers indisputable value in any business context. The supply chain will be no exception.


Blockchain Is the Change Driver

The supply chain is configured as we know it today because it’s impossible to create a contract that accounts for every possible contingency. Consider cross-border financial transfers, which are so complex and must meet so many regulations that they require a tremendous number of intermediaries to plug the gaps: lawyers, accountants, customer service reps, warehouse operators, bankers, and more. By reducing that complexity, blockchain technology makes intermediaries less necessary—a transformation that is revolutionary even when measured only in cost savings.

“If you’re selling 100 items a minute, 24 hours a day, reducing the cost of the supply chain by just $1 per item saves you more than $52.5 million a year,” notes Dirk Lonser, SAP go-to-market leader at DXC Technology, an IT services company. “By replacing manual processes and multiple peer-to-peer connections through fax or e-mail with a single medium where everyone can exchange verified information instantaneously, blockchain will boost profit margins exponentially without raising prices or even increasing individual productivity.”
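
Lonser’s figure checks out with simple arithmetic:

```python
items_per_minute = 100
savings_per_item = 1  # dollars

# 100 items/min * 60 min * 24 hours * 365 days * $1 saved per item
annual_savings = items_per_minute * 60 * 24 * 365 * savings_per_item
print(annual_savings)  # 52560000, i.e. "more than $52.5 million a year"
```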

But the potential for blockchain extends far beyond cost cutting and streamlining, says Irfan Khan, CEO of supply chain management consulting and systems integration firm Bristlecone, a Mahindra Group company. It will give companies ways to differentiate.

“Blockchain will let enterprises more accurately trace faulty parts or products from end users back to factories for recalls,” Khan says. “It will streamline supplier onboarding, contracting, and management by creating an integrated platform that the company’s entire network can access in real time. It will give vendors secure, transparent visibility into inventory 24×7. And at a time when counterfeiting is a real concern in multiple industries, it will make it easy for both retailers and customers to check product authenticity.”

Blockchain allows all the critical steps of the supply chain to go electronic and become irrefutably verifiable by all the critical parties within minutes: the seller and buyer, banks, logistics carriers, and import and export officials. Although the key parts of the process remain the same as in today’s analog supply chain, performing them electronically with blockchain technology shortens each stage from hours or days to seconds while eliminating reams of wasteful paperwork. With goods moving that quickly, companies have ample room for designing new business models around manufacturing, service, and delivery.


Challenges on the Path to Adoption

For all this to work, however, the data on the blockchain must be correct from the beginning. The pills, produce, or parts on the delivery truck need to be the same as the items listed on the manifest at the loading dock. Every use case assumes that the data is accurate—and that will only happen when everything that’s manufactured is smart, connected, and able to self-verify automatically with the help of machine learning tuned to detect errors and potential fraud.

Companies are already seeing the possibilities of applying this bundle of emerging technologies to the supply chain. IDC projects that by 2021, at least 25% of Forbes Global 2000 (G2000) companies will use blockchain services as a foundation for digital trust at scale; 30% of top global manufacturers and retailers will do so by 2020. IDC also predicts that by 2020, up to 10% of pilot and production blockchain-distributed ledgers will incorporate data from IoT sensors.

Despite IDC’s optimism, though, the biggest barrier to adoption is the immaturity of enterprise use cases, particularly around blockchain. Currently, the only significant blockchain system running in production at scale is the virtual currency Bitcoin, which has unfortunately been tainted by its associations with speculation, dubious financial transactions, and the so-called dark web.

The technology is still in a sufficiently early stage that there’s significant uncertainty about its ability to handle the massive amounts of data a global enterprise supply chain generates daily. Never mind that it’s completely unregulated, with no global standard. There’s also a critical global shortage of experts who can explain emerging technologies like blockchain, the IoT, and machine learning to nontechnology industries and educate organizations in how the technologies can improve their supply chain processes. Finally, there is concern about how blockchain’s complex algorithms gobble computing power—and electricity (see “Blockchain Blackouts”).

Blockchain Blackouts

Blockchain is a power glutton. Can technology mitigate the issue?

A major concern today is the enormous carbon footprint of the networks creating and solving the algorithmic problems that keep blockchains secure. Although virtual currency enthusiasts claim the problem is overstated, Michael Reed, head of blockchain technology for Intel, has been widely quoted as saying that the energy demands of blockchains are a significant drain on the world’s electricity resources.

Indeed, Wired magazine has estimated that by July 2019, the Bitcoin network alone will require more energy than the entire United States currently uses and that by February 2020 it will use as much electricity as the entire world does today.

Still, computing power is becoming more energy efficient by the day and sticking with paperwork will become too slow, so experts—Intel’s Reed among them—consider this a solvable problem.

“We don’t know yet what the market will adopt. In a decade, it might be status quo or best practice, or it could be the next Betamax, a great technology for which there was no demand,” Lonser says. “Even highly regulated industries that need greater transparency in the entire supply chain are moving fairly slowly.”

Blockchain will require acceptance by a critical mass of companies, governments, and other organizations before it displaces paper documentation. It’s a chicken-and-egg issue: multiple companies need to adopt these technologies at the same time so they can build a blockchain to exchange information, yet getting multiple companies to do anything simultaneously is a challenge. Some early initiatives are already underway, though:

  • A London-based startup called Everledger is using blockchain and IoT technology to track the provenance, ownership, and lifecycles of valuable assets. The company began by tracking diamonds from mine to jewelry using roughly 200 different characteristics, with a goal of stopping both the demand for and the supply of “conflict diamonds”—diamonds mined in war zones and sold to finance insurgencies. It has since expanded to cover wine, artwork, and other high-value items to prevent fraud and verify authenticity.
  • In September 2017, SAP announced the creation of its SAP Leonardo Blockchain Co-Innovation program, a group of 27 enterprise customers interested in co-innovating around blockchain and creating business buy-in. The diverse group of participants includes management and technology services companies Capgemini and Deloitte, cosmetics company Natura Cosméticos S.A., and Moog Inc., a manufacturer of precision motion control systems.
  • Two of Europe’s largest shipping ports—Rotterdam and Antwerp—are working on blockchain projects to streamline interaction with port customers. The Antwerp terminal authority says eliminating paperwork could cut the costs of container transport by as much as 50%.
  • The Chinese online shopping behemoth Alibaba is experimenting with blockchain to verify the authenticity of food products and catch counterfeits before they endanger people’s health and lives.
  • Technology and transportation executives have teamed up to create the Blockchain in Transport Alliance (BiTA), a forum for developing blockchain standards and education for the freight industry.

It’s likely that the first blockchain-based enterprise supply chain use case will emerge in the next year among companies that see it as an opportunity to bolster their legal compliance and improve business processes. Once that happens, expect others to follow.


Customers Will Expect Change

It’s only a matter of time before the supply chain becomes a competitive driver. The question for today’s enterprises is how to prepare for the shift. Customers are going to expect constant, granular visibility into their transactions and faster, more customized service every step of the way. Organizations will need to be ready to meet those expectations.

If organizations have manual business processes that could never be automated before, now is the time to see if it’s possible. Organizations that have made initial investments in emerging technologies are looking at how their pilot projects are paying off and where they might extend to the supply chain. They are starting to think creatively about how to combine technologies to offer a product, service, or business model not possible before.

A manufacturer will load a self-driving truck with a 3D printer capable of creating a customer’s ordered item en route to delivering it. A vendor will capture the market for a socially responsible product by allowing its customers to track the product’s production and verify that none of its subcontractors use slave labor. And a supermarket chain will win over customers by persuading them that their choice of supermarket is also a choice between being certain of what’s in their food and simply hoping that what’s on the label matches what’s inside.

At that point, a smart supply chain won’t just be a competitive edge. It will become a competitive necessity. D!


About the Authors

Gil Perez is Senior Vice President, Internet of Things and Digital Supply Chain, at SAP.

Tom Raftery is Global Vice President, Futurist, and Internet of Things Evangelist, at SAP.

Hans Thalbauer is Senior Vice President, Internet of Things and Digital Supply Chain, at SAP.

Dan Wellers is Global Lead, Digital Futures, at SAP.

Fawn Fitter is a freelance writer specializing in business and technology.

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


Digitalist Magazine

Google and NCAA team up on March Madness AI competition with $100,000 in prizes


Google and the NCAA are staging a data science contest to see who can best pick a bracket for the upcoming March Madness college basketball tournament. The two organizations have ponied up some serious incentives for people to compete, with a total of $100,000 in prize money available.

The contest will be hosted by Kaggle, a platform for data science competitions that Google acquired last year. Teams will compete in trials for both the Division I men’s and women’s tournaments. For each of those tournaments, the first place team will receive $25,000, the second place team will receive $15,000, and the third place team will receive $10,000.

This is hardly the first time AI techniques have been used to tackle the question of how to build a March Madness bracket, nor even the first time Kaggle has hosted a competition to see which machine reigns supreme. Members of the online machine learning competition community have competed since 2014, mostly for bragging rights and ranking on the platform.

Microsoft’s Bing search engine and Cortana virtual assistant also offer that tech titan’s predictions for who’s going to succeed.

Google is using this tournament as an opportunity to drum up interest in machine learning, especially for its own cloud services. The company will be co-hosting 20 events to teach college students about machine learning skills and how to use Google Cloud Platform. Kaggle CEO Anthony Goldbloom will hold an “ask me anything” question and answer session on Reddit Wednesday at 10 a.m. Pacific to discuss the application of machine learning to topical events like this competition.

Right now, Kaggle teams can access data for how particular schools have performed in the past, in order to build their models for this year’s competition. On March 11, the NCAA will announce the 68 college teams that will participate in the tournament. The following day, data scientists can start submitting their predictions, until 7 a.m. Pacific on March 15.

Teams will be judged based on a log loss function, which evaluates whether a model’s prediction is correct, as well as how confident that model was. It’s built to penalize confident incorrect predictions the most. The smaller a team’s log loss is, the better.
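
A minimal implementation of that scoring rule, assuming the standard binary log loss formulation (Kaggle’s actual evaluation code is not shown here), makes the penalty structure concrete:

```python
import math

def log_loss(predictions):
    """Mean negative log-likelihood over (predicted_prob, outcome) pairs.
    predicted_prob is the model's probability that team A wins;
    outcome is 1 if team A actually won, else 0."""
    eps = 1e-15  # clamp probabilities to avoid log(0)
    total = 0.0
    for p, y in predictions:
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(predictions)

confident_right = log_loss([(0.95, 1)])  # small loss
hedged          = log_loss([(0.50, 1)])  # moderate loss
confident_wrong = log_loss([(0.95, 0)])  # large loss

# Confident wrong predictions are penalized far more than hedged ones
assert confident_right < hedged < confident_wrong
```

This is why winning entries tend to hedge on true toss-up games rather than predicting every matchup at 99% confidence.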

Once the submission period closes, teams will be able to watch and see how their models handle the live results of the games played.


Big Data – VentureBeat

On-premises data gateway February update is now available

We are excited to announce that we have just released the February update for the On-premises data gateway. Here are some of the things that we would like to highlight with this month’s release:

  • You are now able to refresh queries combining data from on-premises and cloud data sources.
  • Improvements to the gateway support for DirectQuery over SAP HANA (preview).
  • The on-premises data gateway now includes the February version of the mashup engine.

You can download this new version from the link below and continue reading for more details about each enhancement.


Important Note

Before we start talking about this month’s improvements, we wanted to remind you that on March 15th, support for on-premises data gateway versions older than the August 2017 release will end.

We always recommend staying up to date with new releases. If you are using a release older than August 2017, please upgrade to a newer version to avoid disruptions to gateway functionality.

Please refer to our documentation here for more details and the reasons behind this change.

 

Now onto this month’s enhancements

Combining on-premises and cloud data sources

This has been the top gateway ask from the community on our ideas website.

Last year we enabled the ability to refresh Power BI Desktop files that retrieve on-premises data and cloud data in separate queries. A major remaining request was the ability to merge or append data from those sources in the same query. We have been working on this capability, and we are happy to announce that it is now available.

Here is a quick guide on how to do that:

  1. After installing on-premises data gateway, go to the “Manage gateways” page and select the gateway you want to use.
  2. Under Gateway Cluster Settings, you’ll see a new checkbox labeled “Allow user’s cloud data sources to refresh through this gateway cluster. These cloud data sources do not need to be configured under this gateway cluster”. Make sure this box is checked (currently it is not checked by default).


  3. Add any on-premises data sources you are using in your queries under this gateway cluster as you do normally. You do not need to add the cloud data sources you are using in your queries.
  4. Upload the Power BI Desktop file that has the queries that combine the on-premises and the cloud data sources to the Power BI service.
  5. Go to the dataset settings page for the new dataset. You’ll notice the following:
    • For the on-premises source, you can pick the gateway that has this source defined like you normally do.
    • For the cloud source, you will be able to edit its credentials under “Data source credentials” as you normally do with any other cloud source.


  6. Once the cloud credentials are set, you can now refresh the dataset using the Refresh now option or schedule it to refresh periodically.

Improvements to the gateway support for DirectQuery over SAP HANA

Earlier this month, with the February release of Power BI Desktop, we announced improvements to DirectQuery over SAP HANA connectivity particularly around treating SAP HANA as a multi-dimensional source by default.

With this month’s gateway release, Power BI Desktop reports using that improved SAP HANA DirectQuery connector can be used with the on-premises data gateway as well.

For more details on how to use this new improved SAP HANA DirectQuery connector in Power BI Desktop, please check our announcement earlier this month as well as our SAP HANA DirectQuery connectivity documentation.

 

Updated version of the mashup engine

Last but not least, this gateway update includes the same version of the Mashup engine as the Power BI Desktop update released earlier this month. This will ensure that the reports that you publish to the Power BI Service and refresh via the gateway will go through the same query execution logic/runtime as in the latest Power BI Desktop version. Note that there are some Beta connectors that are still not supported outside of Power BI Desktop. Please refer to specific connector documentation for more details or contact us if you have any questions.

That’s all for this month’s gateway update. We hope that you find these enhancements useful and continue sending us feedback for what new capabilities you’d like to see in the future.



Additional resources:

· Try Power BI

· Follow @MSPowerBI on Twitter

· Join the conversation at the Power BI Community


Microsoft Power BI Blog | Microsoft Power BI

Changing Data Types on Large Tables: The INT to BIGINT Conundrum

During one of our regular SQL Server health checks, using sp_blitz, one of our largest production tables raised a worrying alert. The ID column of a table holding customer order information was an INT datatype, and it was running out of numbers.

The table was around 500GB with over 900 million rows. Based on the average number of inserts a day on that table, I estimated that we had eight months before inserts on that table would grind to a halt. This was an order entry table, subject to round-the-clock inserts due to customer activity. Any downtime to make the conversion to BIGINT was going to have to be minimal.
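
The runway math behind that estimate is straightforward. The figures below are hypothetical, for illustration only; the article does not disclose the table’s actual insert rate or current maximum ID:

```python
INT_MAX = 2_147_483_647  # upper bound of SQL Server's signed 32-bit INT

def days_until_exhaustion(current_max_id, inserts_per_day):
    """Rough runway estimate before an identity column overflows INT."""
    remaining = INT_MAX - current_max_id
    return remaining / inserts_per_day

# Hypothetical figures chosen only to show the shape of the calculation
days = days_until_exhaustion(current_max_id=1_900_000_000,
                             inserts_per_day=1_000_000)
assert 240 < days < 250  # roughly eight months of headroom
```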

This article describes how I planned and executed a change from an INT to a BIGINT data type, replicating the process I used in a step by step guide for the AdventureWorks database. The technique creates a new copy of the table, with a BIGINT datatype, on a separate SQL Server instance, then uses object level recovery to move it into the production database.

Assessing the options

The obvious option for changing the datatype from INT to BIGINT was simply to ALTER the table directly. However, the requirement for minimal downtime ruled this out immediately. The ID column is a clustered Primary Key, which poses the first issue: you must drop the clustered index before running the ALTER TABLE command.

This was going to cause an outage of around nine hours and a considerable amount of logging. We could add extra disk capacity to cope with the logging requirements, but that length of downtime was not an option.

We didn’t have the luxury of being on Azure SQL Database, or SQL Server 2016 or 2017, each of which offer online rebuild options. There are limitations to online rebuilds though, one being this warning from MSDN:

Online alter column does not reduce the restrictions on when a column can be altered. References by index/stats, etc. might cause the alter to fail.

Another option involved the use of a trigger. This would entail copying all the data to a new table, creating all indexes and constraints, then creating a trigger to make sure inserts go to both tables. I was worried about many aspects of this idea, including maintenance and performance.

Another option suggested was to reseed the INT to use negative numbers. This means reseeding the identity at -1 and counting down toward -2.147 billion, but I wasn’t a fan of the idea, and, in any event, this was not a long-term answer.

A standard approach I’d seen documented, was to create a replica table, except with a BIGINT datatype instead of INT and copy the data across in small batches. You then keep the two tables in sync, by capturing modifications to the original table (using a trigger or Change Data Capture) and applying them to the new table. Finally, during some brief downtime, switch the new table for the old. This was my preferred approach…but with a new twist. I created the replica table on a separate development instance, rather than side-by-side with the original in production. Instead of a trigger or Change Data Capture, I used SSIS packages to keep the tables in sync. Then, I used object-level restore to move the new table to production for the switch over.

I’d not seen this approach written about anywhere, but it seemed like a great way to minimize any possible disruption to production, and my instincts told me it would work!

The proof of concept

I did a lot of work in our test and development environments, making sure this approach was going to work exactly as intended. The following sections summarize the testing work. The demo mimics the steps I took as closely as possible but using the sample AdventureWorks database. I’ll assume you’ve restored the database to a development environment and start by creating the replica table.

Create the ‘replica’ table

In a newly restored AdventureWorks database, create a PersonNEW table alongside the original Person table, using a BIGINT data type for the clustered index column as shown in Listing 1. Note: to more closely follow along with the process used when I made the change in production, create the new table in a database on another instance.

Listing 1: Create the PersonNEW table


Transfer across the data, then create indexes

I used SSIS to transfer all the data across into the PersonNEW table and then created all the necessary indexes and constraints. When you create the SSIS package, make sure to click Enable Identity Insert (see below); you will find this option under the Edit Mappings tab in Select Source Tables and Views. In my scenario there was an identity column, so this was needed. I also didn’t want any discrepancies, since the ID is a unique number for each order, used by many applications and throughout the company.


During testing, I periodically updated the data in the BIGINT table using SSIS packages. For example, if the last import stopped at ID 6000, I would create the next SSIS package with a filter of ID > 6000. Filtering on the clustering key eliminated scanning the table on each run, making the transfer as efficient as possible. I did this every day to keep the data-transfer time down. Listing 2 shows the query to use in the SSIS package for the Person table.

Listing 2: The transfer query
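The query didn’t survive extraction; a sketch of the shape it would take is below, where the literal 6000 stands in for whatever ID the previous transfer reached:

```sql
-- Hypothetical sketch of Listing 2: the SSIS source query.
-- Filtering on the clustered key means only rows added since the last
-- transfer are read, via a seek rather than a full table scan.
SELECT *
FROM [Person].[Person]
WHERE [BusinessEntityID] > 6000;
```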

During testing, I also used Redgate’s SQL Data Compare after data transfers to verify that the data was copied exactly as expected.

Object level restore

The next step was testing the process on a separate staging server. I wanted to see if I could use object level recovery of the table into a database with a different name. To do this, I had to use a third-party SQL Server backup tool since object level recovery is not supported natively. I restored a fresh copy of AdventureWorks to the staging server, naming it AdventureWorksBIGINT. This represented the Production database in my test. I then restored the new table, PersonNEW, from the backup into the new staging database.

This was a smoke test to make sure that the same object level restore, from Development to Production, would work exactly as expected. When restoring to Production, I restored the table using the object level recovery feature in my SQL Server backup tool.

Create a trigger to stop entries to the original table

During the switch-over to the new table, changes to the data must be stopped. I used a trigger (Listing 3) to stop all changes to the production table, preventing the application from inserting, updating, or deleting rows.

Listing 3: Trigger to stop changes
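The trigger code isn’t reproduced here. One way to write such a trigger (the name is illustrative) is an INSTEAD OF trigger that raises an error rather than performing the requested change, making the table effectively read-only:

```sql
-- Hypothetical sketch of Listing 3: reject all inserts, updates, and
-- deletes against the original table during the switch-over. Because
-- this is an INSTEAD OF trigger, the requested DML is never performed.
CREATE TRIGGER [Person].[trg_Person_ReadOnly]
ON [Person].[Person]
INSTEAD OF INSERT, UPDATE, DELETE
AS
BEGIN
    RAISERROR('Person is read-only during the BIGINT migration.', 16, 1);
END;
```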


Swap out the old table for the new one

Now that both the original and replica tables were in the same database, the final step was to swap the tables: swapping the indexes, constraints, table names, foreign keys, a trigger, and a couple of database permissions for denying access to certain columns. You can download the test object flip script for AdventureWorks at the bottom of this article, but I won’t show it here. Looking back, I overcomplicated the index-name flip, as only the primary key needed renaming in my environment. Bear in mind that not all indexes need to be renamed, since index names only have to be unique within a table; the same name can exist on two different tables.
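The full script is in the download; a condensed, hypothetical sketch of its core, renaming just the two tables and their primary keys, looks something like this:

```sql
-- Hypothetical sketch of the table swap. The real script also moves
-- foreign keys, remaining index names, a trigger, and column permissions.
BEGIN TRANSACTION;
    EXEC sp_rename 'Person.Person',    'PersonOLD';
    EXEC sp_rename 'Person.PersonNEW', 'Person';
    EXEC sp_rename 'Person.PK_Person_BusinessEntityID',
                   'PK_PersonOLD_BusinessEntityID', 'OBJECT';
    EXEC sp_rename 'Person.PK_PersonNEW_BusinessEntityID',
                   'PK_Person_BusinessEntityID', 'OBJECT';
COMMIT TRANSACTION;
```

Wrapping the renames in one transaction keeps the swap atomic: either every object flips or none do.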

Before I did the swap, I checked which rows were last added to the table, so when I did the last transfer, I knew what the most recent row should be. I used Redgate’s SQL Data Compare to make sure none of the rows were different.

Since the replica table lived on our development environment, I could amend it as I pleased, so I also decided to compress it, saving around 200GB. Finally, I took a differential backup to make sure we had an up-to-date backup of the production database on hand, in case any issues occurred.

Everything looked good, so I ‘flicked the object flip switch’.

Once all the objects are renamed, you can remove the trigger to reopen the table. Listing 4 shows how to check that you can access the table and that entries can be made again.

Listing 4: Insert check query
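The check didn’t survive extraction; a hypothetical sketch would be a test insert followed by a datatype check against the system catalog:

```sql
-- Hypothetical sketch of Listing 4: confirm the swapped table accepts
-- writes again and that its key column is now BIGINT. The insert relies
-- on the other NOT NULL columns having defaults, as in AdventureWorks.
INSERT INTO [Person].[Person] ([PersonType], [FirstName], [LastName])
VALUES ('EM', 'Test', 'Person');

SELECT c.name AS column_name, t.name AS data_type
FROM sys.columns c
JOIN sys.types t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('Person.Person')
  AND c.name = 'BusinessEntityID'; -- should now report bigint
```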

Check your tables and you will see the Person table now has a BIGINT datatype!

Moving onto Production…

To my surprise, my method worked like a charm. My boss Sjors Takes (blog | Simple-Talk articles) was impressed.

We ran a pilot in our Acceptance environment, which mimics our production set up, and it worked fine also.

When running on Acceptance and Production, the process was performed in the following steps:

  1. Restore a full database backup of the production database to a dev/test environment
  2. In the restored database, create the replica table with a BIGINT instead of INT
  3. Create the SSIS package and, with IDENTITY INSERT enabled, transfer the data
  4. Create all the indexes and constraints on the replica table
  5. Compress the new table (optional)
  6. Perform the object restore into the production database, keeping the table named PersonNEW
  7. Update the PersonNew table periodically using SSIS packages to transfer data from a reporting instance in an availability group
  8. In the planned maintenance window, do one more SSIS transfer and then create the trigger to make the table read-only. We also shut down the applications that access this table
  9. Do a differential backup
  10. Do the table swap
  11. Check data consistency using SQL Data Compare
  12. Remove trigger and bring the APIs back online

This method cut down the downtime from a potential nine hours to less than fifteen minutes, and a lot of the intensive work was done away from the production instance. I did not see much impact on performance from the restore of the table using object level recovery.

Summary

There are many ways to make this and other datatype changes to production databases. The option you choose will often depend on the downtime window available to you. The shorter it is, the more ‘creative’ you need to be with your solution, while always ensuring data integrity as the number one priority.

The method I’ve described minimized both downtime, and the potential to affect production server performance, since it allowed me to do most of the work to the table on a separate development instance.


SQL – Simple Talk

Chris Paul Talks Growth through Teamwork & Selflessness at Grow Live Houston

Posted by Evan Heby, Service Industry Marketing Manager

Playing professional basketball is hard. Becoming one of the best point guards to ever play the game, a model citizen for the NBA and still finding time for your family is even harder. But, that’s exactly what Chris Paul – CP3 – has accomplished. Basketball fans know CP3 as one of the best passers, defenders and competitors in the history of the NBA.

By partnering with our great customer Steiner Sports, who specializes in sports memorabilia and sports marketing, I had the opportunity to host a 30-minute interview with the Houston Rockets Point Guard where we discussed everything from basketball to teamwork to Beyonce & Jay-Z. Chris provided the crowd an inside look at what it’s like to be on the best NBA team in the league and outlined how he balances his professional career with philanthropy, business and family.


CP3 downplays his own individual achievements, instead opting to highlight the importance of a team culture, not just on the court but in everything you do. When it comes to basketball, the only individual accolade he takes a lot of pride in is his selections to the NBA All-Defensive Team (an award given to the top defensive players each season), because he feels that playing defense is selfless; it doesn’t necessarily show up in the stat sheet, but it helps the entire team. Chris attributed much of his growth as a player and a person to selflessness – minimizing the importance of his personal achievements to instead highlight the greater effect a group can have when working towards a goal.

Chris shared several stories about how being part of a team can lead to exponential growth and positive achievements. Case in point: his work with the National Basketball Players Association (NBPA).


Chris has held the role of president of the NBPA since 2013. When he was first offered the role, he was reluctant and tried to turn it down because of the massive responsibility that comes with the job. Chris compared being on the board of the NBPA to being on the executive board of a business, but in this case you are not compensated with equity or stock – this board is an unpaid service position. When asked, “What’s your greatest accomplishment as NBPA president?” he responded: “It’s never about me – it’s not my biggest achievement; it’s our biggest achievement.” Chris went on to specifically call out a couple of his past colleagues on the board, like James Jones, and then detailed their biggest achievement as a unit – healthcare for retired players. “One thing’s for certain: anybody who plays in the NBA right now, at some point they will be a retired player; [healthcare for retired players is] one thing that no other league has, and now we have a fund that takes care of our retired players.”

Chris shared a few of his favorite basketball and personal moments with us, and he attributes his growth as a player and a person to the extraordinary teams that surround him. Oracle NetSuite sponsored this event with CP3 at the St. Regis in Houston on February 22nd, 2018. To learn more about our Grow Live series, and possibly join us in our next city, click here.

Posted on Mon, February 26, 2018
by NetSuite filed under


The NetSuite Blog

Parenting Lesson #3,756

They always have to poop





About Krisgo

I’m a mom that has worn many different hats in this life; from scout leader, camp craft teacher, parents group president, colorguard coach, member of the community band, and stay-at-home mom to full-time worker, I’ve done it all – almost! I still love learning new things, especially creating and cooking. Most of all I love to laugh! Thanks for visiting – come back soon!


Deep Fried Bits