Tag Archives: future

Hands-on: Amazon Fresh grocery stores tease brick-and-mortar retail’s future

November 2, 2020   Big Data

There were no lines outside Irvine, California’s new Amazon Fresh grocery store on its opening day last week, despite the fact that it was only the second such location in the world — and the first to be open to the public on day one. But after early visitors discovered the store’s high-tech shopping carts, two lines formed over the weekend, stretching past Amazon’s front doors to adjacent retailers in the suburban plaza. One line was longer and moving slower than the other.

“Do you want to try the Dash Cart?” an employee asked people near the end of the queue. “If not, you can move into the shorter line, and you’ll get in faster.”

We were there specifically for the Dash Cart: Friends told us that it was worth the 10- to 20-minute wait to go hands-on with one of the 25 magical shopping carts, since their integrated touchscreens and cameras were the key to Amazon’s next-generation shopping experience. The Dash Cart felt like the future of brick-and-mortar retail, they said, even though the rest of the store wasn’t that amazing.

Our friends were correct, but there’s more to the Amazon Fresh story than just Dash Carts. Here’s what it’s like to visit the supermarket of the future, today, as it’s been dreamed up and implemented by Amazon.

A boxy, spartan layout, seemingly by design

Unlike Whole Foods, the high-end supermarket chain Amazon acquired in 2017, Amazon Fresh stores look like small warehouses, and have all the charm of Walmart’s grocery sections, minus two-thirds the people and half the choices. From the signage to the aisles and specialty counters, Amazon’s latest store feels as if it were designed largely by engineers, and conceived to be as easy as possible to retrofit inside another retailer’s abandoned space — in this case, the 40,000 square feet formerly occupied by a Babies R Us store.

Apart from the produce, nothing about the environment feels organic: Floors are spartan, displays are boxy, and everything looks to have been optimized for customers by computers, rather than humans. There are places to purchase whole cooked chickens for $4.97 and pizzas for $8.99, but nowhere to sit and eat them. A staffed customer service area is in the back, not the front, which instead allocates a lot of interior space to managing shopping carts.

Even the baked goods, which in other stores flow attractively off the edges of store shelves, seem to have been assigned to a specific corner of Amazon Fresh and told to stay firmly within the lines. If it weren’t for a cadre of friendly greeters, walking through Amazon Fresh would feel more like visiting a warehouse or Costco than shopping in a typical supermarket of its size.

That feeling extends to how Amazon Fresh uses — and doesn’t use — people in its operations. Instead of having employees answer inventory-related questions, Amazon scatters Alexa terminals throughout the store, offering AI guidance on item locations, wine pairings, and measurement unit conversions. On our first visit, the Alexa terminals were both working and helpful, accurately pointing us towards items we wanted to locate. But on our second visit, all of the terminals were experiencing “connectivity issues,” perhaps the closest Amazon Fresh stores will get to a business-disrupting employee strike.

The Alexa terminals suggest that Amazon wants to staff Fresh stores as leanly as possible, even if it’s liberally using employees during the launch phase to address potential customer pain points. There were lots of Fresh staffers — too many, really — constantly restocking shelves while otherwise keeping to themselves, plus the aforementioned greeters at the front doors to help get people in and out of the store. In traditional supermarkets, all of these employees might be floaters who move from place to place as needed, alternating between helping customers and restocking shelves. But at Amazon Fresh, Alexa could help reduce the need for greeters as customers become familiar with the technology, and shelving recalibrations could reduce the need for such frequent stock replenishment.

Lower labor costs could translate directly into lower prices. And half of the new store’s appeal is reasonable pricing — that’s the single biggest problem with Whole Foods, which offers well-heeled customers an impressive selection of high-end foods and beverages that just aren’t affordable to the masses. By contrast, Amazon Fresh is clearly aimed at middle-income shoppers who still want to do some of their purchasing and browsing in person instead of on a computer screen. There are a handful of fancy items on the shelves, such as $10 pints of McConnell’s ice cream, but most of the signage is directed towards selling 15-cent bananas and 89-cent loaves of bread, rather than champagne and caviar.

Dash Cart as a solution and a problem

The more exciting part of Amazon Fresh is the Dash Cart, a shopping cart that uses sensors and smartphone technology to replace checkout lanes and standalone produce scales. As mentioned above, you don’t have to use a Dash Cart to shop at Amazon Fresh, and despite its speedy name, you’ll likely get in and out of the store faster without waiting for one. But without a Dash Cart, the shopping experience isn’t hugely different from any old small suburban supermarket you’ve previously visited.

Once you make it through the Dash Cart waiting line, you’ll get a three-minute human tutorial that explains how to link your cart to your Amazon app with a QR code, scan packaged items by dropping them into one of two included paper bags, and add produce items by inputting four-digit PLU codes into the cart’s tablet screen. These steps are supposed to eliminate the need for employees to check you out and bag your purchases; instead, the cart’s cameras and scale track everything you place in the bags, so when you leave the store, your Amazon account is automatically charged for whatever you bagged yourself. It’s an evolution of what Amazon pioneered with much smaller Amazon Go stores years ago.
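
To make that flow concrete, here is a minimal, hypothetical sketch of a cart session as described above; the class and method names are invented for illustration and are not Amazon’s software. The cart is linked to an account via a QR code, packaged items are added by scan, produce is added by PLU code plus the cart’s scale reading, and the linked account is charged when the session ends.

```python
# Hypothetical sketch of the Dash Cart flow described above (not Amazon's code):
# link an account, scan packaged items, enter produce by PLU code, charge on exit.
from dataclasses import dataclass, field

@dataclass
class CartSession:
    account_id: str                        # linked by scanning the app's QR code
    items: list = field(default_factory=list)

    def scan_packaged(self, barcode: str, price: float) -> None:
        """Packaged goods are recognized as they're dropped into a bag."""
        self.items.append((barcode, price))

    def add_produce(self, plu: str, weight_lb: float, price_per_lb: float) -> None:
        """Produce is keyed in by its four-digit PLU; the cart's scale supplies weight."""
        self.items.append((plu, round(weight_lb * price_per_lb, 2)))

    def checkout(self) -> float:
        """On exit, the linked account is charged for everything in the bags."""
        total = sum(price for _, price in self.items)
        print(f"Charging account {self.account_id}: ${total:.2f}")
        return total

session = CartSession(account_id="shopper-123")
session.scan_packaged("012345678905", 4.97)                     # whole cooked chicken
session.add_produce("4011", weight_lb=2.0, price_per_lb=0.15)   # bananas
session.checkout()
```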

Dash Carts are cool in concept, but their execution leaves a lot to be desired. On a positive note, their cameras and software did a good job with accurately scanning items we placed in the bags, and automatically removing items if we pulled them from the cart. The tablet-like touchscreen worked as expected, and though the scale inside the cart wasn’t fast, it could — with practice — be faster than walking over to a standalone produce scale and printing out a label for each item.

On the other hand, the Dash Carts had limitations that will need to be resolved in future iterations. Each cart is limited to two bags, which restricts your ability to complete a full shopping trip, and limits Amazon’s maximum take per shopper. You can’t overfill the bags, lest the cart’s cameras become incapable of seeing what’s inside. Additionally, Amazon is so concerned about theft or damage that it swaps each Dash Cart for a regular one before customers leave for the parking lot, or hands you the bags to carry to whatever distant parking space you selected. These are the sorts of practical inconveniences that could kill Dash Cart’s utility for some people.

Real-world glitches also undermined our Dash Cart experience. One of our bags ripped and needed to be replaced during the cart-to-cart transfer. We also had to go through a manual checkout line — including rescanning and rebagging every item — because our cart’s integrated code scanner couldn’t recognize an Amazon coupon. Staff said that the carts were somewhat finicky and had been experiencing hiccups like this.

Whenever one of these issues with the Dash Cart popped up, we felt as if we were holding up people who were waiting behind us, even though the issues weren’t really our fault. Other delays, such as learning how to enter PLU codes and weigh produce, caused Dash Cart users to abruptly stop mid-aisle and fidget with the tablet’s screen. We noticed some customers without the high-tech carts becoming visibly frustrated with other customers’ Dash-inspired touch interactions, but cart users seemed to be too focused on their screens to notice.

Could data make all the difference for future retailers?

It’s easy to overlook a key element of this retail experience — the intersection between Amazon.com and Amazon Fresh — because it’s so muddled at the moment. But it could wind up being a critical differentiator for Amazon’s brick-and-mortar ventures going forward.

As part of the initial onboarding experience, Amazon openly encourages Dash Cart users to digitally manage their shopping lists with the cart and browse current in-store specials using their phones. This is a mess for two reasons: The cart’s integrated shopping list management software is extremely limited, and the idea of asking users to check not just one but two touchscreens while they’re shopping is just straight-out crazy. No one wants to be stuck behind that guy who’s blocking shelves or freezers while browsing through lists and brochures. If Larry David ever visits Amazon Fresh, there’s enough material here for an entire Curb Your Enthusiasm sub-plot.

Yet there’s obvious value in tying the internet directly — and more thoughtfully — to a customer’s shopping cart. Your first visit to an Amazon Fresh store could conceivably be your last trip through its aisles: Amazon could just present you with a list of the items you purchased, offer to reorder them, and make them instantly available for either pickup or delivery. That could eliminate the need (and the premium people currently pay) for Instacart. It also could reduce the footprints of future Amazon Fresh stores by lowering the number of people who simultaneously walk through them, enabling many customers to complete transactions using the equivalent of drive-through windows.

Amazon is technically doing some if not most of these things already, but it needs to refine its smartphone and cart software to make the end-to-end experience intuitive and frictionless for customers. Somewhat ironically, the sign that it has succeeded will be if its Fresh grocery stores aren’t packed with people but are still hugely profitable, which is to say that they’ll be moving tons of products without the packed aisles and long lines normally associated with successful supermarkets.

The best of the rest of Amazon, plus coupons

One thing we loved at Amazon Fresh was an area labeled “Customer Service, Returns & Pick Up.” Normally, these things are found very close to the entrance of a supermarket, but at Amazon Fresh, they’re in the back, a decision that was likely made to get returns and pick-ups closer to the store’s storage areas and loading docks. Customers can pick up items from Amazon lockers and drop off Amazon returns — conveniences that simultaneously provide an incentive to do grocery shopping while eliminating the need to visit standalone Amazon shipping and return locations, something we can see ourselves using at least occasionally.

Amazon Fresh also includes a limited selection of the online retailer’s popular gadgets and books. We spotted the José Andrés cookbook Vegetables Unleashed and the Death & Co. cocktail guide on shelves only a short distance away from Fire tablets and Echo speakers, none of which we were looking to purchase at a supermarket — but then, we had already bought some of them online in the past. Over time, new items will replace them, and we might have reason to consider buying non-grocery goods at Amazon Fresh, as well.

At this stage, it would be hard to describe Amazon Fresh as the guaranteed future of brick-and-mortar retailing; the experience currently feels closer to a public beta test than a fully formed and polished business. Visitors can certainly have a normal or even a unique experience in the store, but they’re actually guinea pigs in a grand experiment that runs smoothly — until it doesn’t.

To Amazon’s credit, the speed bumps aren’t too daunting. Moreover, the company is actively addressing problems by handing out coupons to apologize for technical issues, and on one of our two visits, was giving away free cans of sparkling water and refrigerator magnets to everyone exiting the store. Despite the glitches, we didn’t see anyone leaving the store angry, and between the coupons and the small number of bags we left with, we were already planning our next trip to the store as we walked out to our car.

Only Amazon knows whether such a mixed but positive impression counts as “mission accomplished” or whether its early Amazon Fresh grocery customers are just helping it refine a larger campaign to completely dominate the retail world. Thanks to Amazon’s growing scale and unquestionable ambition, the Fresh grocery stores could either become very real challengers to traditional supermarkets — or fizzle out as experiments that made little difference to the company’s bottom line.

If you’re interested in seeing Amazon Fresh for yourself, you can visit the new store at 13672 Jamboree Road in Irvine, or the first location — open to the public since September — at 6245 Topanga Canyon Boulevard in Woodland Hills, California. Both stores are open from 7 a.m. to 10 p.m., seven days a week.


Big Data – VentureBeat

Recognizing data points that signal trends for the future of business post-pandemic

October 18, 2020   Big Data

Planning for a post-COVID-19 future and creating a robust enterprise strategy require both strategic scenario planning and the ability to recognize what scenario planners call “news from the future” — data points that tell you whether the world is trending in the direction of one or another of your imagined scenarios. As with any scatter plot, data points are all over the map, but when you gather enough of them, you can start to see the trend line emerge.

Because there are often many factors pushing or pulling in different directions, it’s useful to think of trends as vectors — quantities that are described by both a magnitude and a direction, which may cancel, amplify, or redirect each other. New data points can also show whether vectors are accelerating or decelerating. As you see how trend vectors affect each other, or that new ones need to be added, you can continually update your scenarios.
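
As a rough illustration of that vector framing (a sketch with made-up numbers, not a forecasting tool), each piece of “news from the future” can be treated as a signed magnitude along a named direction, and the signals summed to see which way the net vector points:

```python
# Illustrative sketch of "trends as vectors": each data point is a signed magnitude
# along a named direction; opposing signals cancel while aligned ones amplify.
from collections import defaultdict

def combine(signals):
    """signals: iterable of (direction, magnitude) pairs -> net magnitude per direction."""
    net = defaultdict(float)
    for direction, magnitude in signals:
        net[direction] += magnitude
    return {direction: round(total, 2) for direction, total in net.items()}

news_from_the_future = [
    ("remote_work", +0.6),    # big tech commits to long-term work-from-home policies
    ("remote_work", +0.3),    # industry job listings start ignoring location
    ("remote_work", -0.2),    # commercial real estate occupancy holds steadier than expected
    ("urban_density", -0.4),  # residential demand shifts from megacities to suburbs
]

print(combine(news_from_the_future))
# {'remote_work': 0.7, 'urban_density': -0.4}
```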

Sometimes a trend itself is obvious. Twitter, Facebook, Google, and Microsoft have each announced a commitment to new work-from-home policies that will continue even after the pandemic. But how widespread will this be? To see if other companies are following in their footsteps, look for job listings from companies in your industry that target new metro areas or ignore location entirely. Drops in the price or occupancy rate of commercial real estate, and how that spills over into residential real estate, might add or subtract from the vector.

Think through possible follow-on effects to whatever trend you’re watching. What are the second-order consequences of a broader embrace of the work-from-home experience? Your scenarios might include the possible emptying out of dense cities that are dependent on public transportation and movement from megacities to suburbs or to smaller cities. Depending on who your workers and your customers are, these changes could have an enormous impact on your business.

What are some vectors you might want to watch? And what are examples of news from the future along those trend lines?

The progress of the pandemic itself. Are cases and deaths increasing or declining? If you’re in the U.S., Covid Act Now is a great site for tracking the pandemic. The data so far suggests that pandemic response won’t be a “one and done” strategy, but more like what Tomas Pueyo described in his essay “The Hammer and the Dance,” in which countries drop the hammer to reduce cases, reopen their economies, see recurrences, and drop the hammer again, with the response becoming increasingly fine-grained and local as better data becomes available. As states and countries reopen, a lot of new data will shape all of our estimates of the future; even if the results are positive, there is new uncertainty about a possible resurgence.

Is there progress toward treatment or a vaccine? Several vaccine candidates are in trials, and new treatments seem to improve the prognosis for the disease. A vector pushing in the other direction is the discovery of previously missed symptoms or transmission factors. Another is the politicization of public health, which began with masks but may also extend to vaccine denial. We may be living with uncertainty for a long time to come; any strategy involving a “return to normal” needs to be held very loosely.

How do people respond if and when the pandemic abates? Whatever comes back is likely to be irretrievably changed. As Ben Evans said, sometimes the writing is on the wall, but we don’t read it. It was the end of the road for BlackBerry the moment the iPhone was introduced; it just took four years for the story to play out. Sometimes a seemingly unrelated shock accelerates a long overdue collapse. For example, ecommerce has been growing its share for years, but this may be the moment when the balance tips and much in-person retail never comes back. As Evans put it, a bunch of industries look like candidates to see a decade’s worth of inevitable change compressed into a week.

Will people continue to walk and ride bikes, bake bread at home, and grow their own vegetables? (This may vary from country to country. People in Europe still treasure their garden allotments 70 years after the end of World War II, but U.S. victory gardens were a passing thing.) Will businesses have the confidence to hire again? Will consumers have the confidence to spend again? What percentage of businesses that shut down will reopen? Are people being rehired and unemployment rates going down? The so-called Y-shaped recovery, in which upper-income jobs have recovered while lower-income jobs are still stagnant, has been so unprecedented that it hasn’t yet made Wikipedia’s list of recession shapes.

Are there meaningful policy innovations that are catching on? Researchers in Israel have proposed a model for business reopening in which people work four-day shifts followed by ten days off in lockdown. Their calculations suggest that this would lower transmissibility of the virus almost as well as full lockdown policies, but allow people in many more occupations to get back to work, and many more businesses to reopen. Might experiments like this lead to permanent changes in work or schooling schedules? What about other long-discussed changes like universal basic income or a shorter work week? How will governments pay for the cost of the crisis, and what will the economic consequences be? There are those, like Ray Dalio, who think that printing money to pay for the crisis actually solves a long-standing debt crisis that was about to crash down on us in any case. Others disagree.

Are business models sustainable under new conditions? Many businesses, such as airlines, hotels, on-demand transportation, and restaurants, are geared very tightly to full occupancy. If airlines have to run planes with half as many passengers, will flights ever be cheap enough to attract the level of passengers we had before the pandemic? Could “on demand” transportation go away forever? Uber and Lyft were already unprofitable because they were subsidizing low prices for passengers. Or might these companies be replaced as the model evolves, much as AOL yielded online leadership to Yahoo!, which lost it in turn to Google? (My bet is that algorithmic, on-demand business models are still in their infancy.)

These topics are all over the news. You can’t escape them, but you can form your own assessment of the deeper story behind them and its relevance to your strategy. Remember to think of the stories as clustering along lines with magnitude and direction. Do they start to show patterns? More importantly, find vectors specific to your business. These may call for deep changes to your strategy.

Also remember that contrarian investments can bring outsized returns. It may be that there are markets that you believe in, where you think you can make a positive difference for your customers despite their struggles, and go long. For O’Reilly, this has been true of many technologies where we placed early bets against what seemed overwhelming odds of success. Chasing what’s “hot” puts you in the midst of ferocious competition. Thinking deeply about who needs you and your products and how you can truly help your customers is the basis for a far more robust strategy.


Big Data – VentureBeat

Storage 101: The Future of Storage

August 21, 2020   BI News and Info

The series so far:

  1. Storage 101: Welcome to the Wonderful World of Storage
  2. Storage 101: The Language of Storage
  3. Storage 101: Understanding the Hard-Disk Drive 
  4. Storage 101: Understanding the NAND Flash Solid State Drive
  5. Storage 101: Data Center Storage Configurations
  6. Storage 101: Modern Storage Technologies
  7. Storage 101: Convergence and Composability 
  8. Storage 101: Cloud Storage
  9. Storage 101: Data Security and Privacy 
  10. Storage 101: The Future of Storage

An IDC report published in November 2018 predicted that the world’s data would grow to 175 zettabytes by the year 2025. For those unaccustomed to such amounts, a zettabyte is about 1,000 exabytes, which comes to one billion terabytes or one trillion gigabytes. Given our current trajectory, we’ll likely see those predictions come true. Even if we fall short, there will still be a heap load of data.
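
A quick back-of-the-envelope check of those units (assuming decimal SI prefixes, where each step is a factor of 1,000):

```python
# Unit sanity check for the figures above, using decimal (SI) prefixes.
GB = 10**9          # gigabyte, in bytes
TB = 10**12         # terabyte
EB = 10**18         # exabyte
ZB = 10**21         # zettabyte = 1,000 EB = 10**9 TB = 10**12 GB

world_data_2025 = 175 * ZB
print(world_data_2025 // EB)   # 175,000 exabytes
print(world_data_2025 // TB)   # 175,000,000,000 terabytes (175 billion TB)
print(world_data_2025 // GB)   # 175,000,000,000,000 gigabytes (175 trillion GB)
```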

Current storage technologies are going to have a tough time keeping up. They’re already having a tough time keeping up. With the explosion of mobile devices, followed by the influx of the Internet of Things (IoT), more data than ever is being generated—by people, by applications, by machines. The only way to derive meaning from all that data is to develop innovative high-performing, high-capacity storage solutions.

Figure 1. Data’s exponential growth (image by geralt)

Scientists are pioneering storage solutions that can support our data loads into the future. To this end, they’re searching for ways to improve NAND flash and storage class memory, while experimenting with new storage technologies. In this article—the last in my series on storage—I provide an overview of many of these efforts to give you a sense of what we might expect in the near and, with any luck, not-too-distant future.

What’s up with NAND flash?

NAND flash has captured a significant share of the data center market, offering substantially better performance and durability than hard-disk drives (HDDs) are physically capable of ever achieving. As NAND’s popularity has increased, along with its densities, prices have steadily dropped, making it a more viable storage option than ever.

Yet even these improvements aren’t enough to meet the demands of many of today’s data volumes and workloads, which is why vendors are working hard to make solid-state drives (SSDs) that can deliver better performance and greater densities while minimizing the cost-per-GB.

The primary strategy for doing so is adding more bits per cell, more layers per chip, or a combination of both. Flash SSDs have gone from one bit per cell to two bits and then three. Now we have quad-level cell (QLC) SSDs, which squeeze four bits into each cell. Initially, QLC flash primarily targeted PCs, but that’s starting to change, with some vendors now offering QLC storage for the data center.

More bits per cell increases the need for error correction, slowing program/erase (P/E) cycles. The additional bits also decrease endurance as cells become more labile. Until significant advances are made in P/E processes such as garbage collection, enterprise QLC flash will be limited to read-intensive workloads. In the meantime, vendors are pushing ahead with more bits per cell, even developing penta-level cell (PLC) SSDs that boast five bits per cell.

At some point, adding more bits per cell will no longer be practical, which is why vendors are also adding more layers to their NAND chips, a technology referred to as 3D NAND. In this type of chip, memory cells are stacked into vertical layers to increase capacity. The first 3D NAND chips had 32 layers. Many vendors now offer SSDs with 96 layers.

In addition, several vendors are ramping up production on 128-layer SSDs, with 256-layer devices on the horizon. Devices featuring 500 and even 800 layers or more are forecast. But additional layers mean thinner materials, amplifying manufacturing challenges and costs. The cost-per-GB is unlikely to decline as quickly as it has been without novel technological advances.
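A rough sketch of the scaling math behind those two levers (illustrative numbers only, not vendor specifications): capacity grows with bits per cell times layer count, while the number of distinct voltage states each cell must distinguish grows as 2 to the power of the bit count, which is why QLC and PLC parts need heavier error correction and wear out sooner.

```python
# Illustrative scaling math for NAND density (not vendor specifications).
def voltage_states(bits_per_cell: int) -> int:
    """A cell storing n bits must reliably hold 2**n distinct charge levels."""
    return 2 ** bits_per_cell

def relative_density(bits_per_cell: int, layers: int, base_bits=1, base_layers=32) -> float:
    """Density relative to a 32-layer, 1-bit-per-cell baseline, same cell footprint assumed."""
    return (bits_per_cell * layers) / (base_bits * base_layers)

for name, bits, layers in [("SLC, 32 layers", 1, 32),
                           ("TLC, 96 layers", 3, 96),
                           ("QLC, 128 layers", 4, 128),
                           ("PLC, 256 layers", 5, 256)]:
    print(f"{name}: {voltage_states(bits)} states per cell, "
          f"{relative_density(bits, layers):.0f}x baseline density")
```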

Who’s invading the flash space?

While vendors continue to enhance their NAND flash offerings, some are also investing in technologies that could eventually replace flash or be used in conjunction with flash to create a hybrid solution. One of these is Intel’s Optane DC SSD, which is based on the 3D XPoint architecture, a storage-class memory (SCM) technology developed by Intel in partnership with Micron.

The Optane DC SSD provides greater throughput and lower latency than a traditional flash SSD, including Intel’s own line of enterprise flash storage. Intel is now working on its second generation of the Optane DC SSD, offering hints that it might nearly double the speed of its first-gen implementation.

Figure 2. Moving into flash territory (image by Fotocitizen)

Not to be outdone, Samsung now offers its own alternative to traditional NAND flash—the Z-SSD drive (or Z-NAND). Although the Z-SSD is based on NAND technologies, it offers a unique circuit design and controller that delivers much better performance. In fact, the Z-SSD is often described as an SCM device and is considered Samsung’s answer to Intel’s Optane DC SSD.

Micron has also released an SSD built on the XPoint architecture—the X100 NVMe SSD. Both Micron and Samsung appear to be planning their next generation of flash alternatives. But they’ve released few details about the devices or how they’ll perform.

In the meantime, Kioxia (formerly Toshiba Memory) is working on its own NAND flash alternative, Twin BiCs FLASH, which the company describes as the “world’s first three-dimensional (3D) semicircular split-gate flash memory cell structure.” That’s quite the mouthful and certainly sounds intriguing. However, the project is still in research and development and will likely not see the light of day for some time to come.

It’s uncertain at this point what the future looks like for NAND flash alternatives such as those from Intel, Micron, Samsung, and Kioxia. Much will depend on how traditional NAND flash evolves and the affordability of these new devices over the long-term. With workload and data demands increasing, organizations will continue to look for whatever solutions can effectively balance performance and capacity against endurance and cost.

Where does storage class memory fit in?

In the last couple of years, storage class memory (SCM) has inspired many headlines, especially with Intel’s recent release of the first Optane DC persistent memory modules (PMMs). The modules plug into standard dual in-line memory module (DIMM) slots, allowing the PMMs to connect directly to the server’s memory space. The Optane DC modules represent a big step toward the vision of a new storage tier that sits between traditional dynamic RAM (DRAM) and NAND flash storage to support demanding enterprise workloads.

Intel’s Optane DC modules are typically referred to as a type of phase-change memory (PCM)—“typically” because the company’s messaging has been somewhat mixed around this issue and they are sometimes considered to be a type of resistive RAM. However, the consensus is that the Optane DC modules fit neatly into the PCM category.

Phase-change memory is a type of nonvolatile memory that stores data by rapidly changing a material between amorphous and crystalline states. Phase-change memory offers much faster performance and lower latency than NAND flash and has the potential of delivering greater endurance. On the other hand, PCM is also much more expensive.

But PCM is not the only SCM effort under development. Scientists are actively researching other technologies that they believe can also serve as a bridge between DRAM and flash storage. One of these is resistive RAM (RRAM or ReRAM), another type of nonvolatile memory that promises significantly greater performance than NAND flash, with speeds approaching those of DRAM.

Resistive RAM works by applying different voltage levels to a material in order to switch its resistance from one state to another. Compared to NAND flash, RRAM offers much better performance and higher endurance while consuming less power. In fact, the technology shows so much promise that it has been proposed as a possible replacement for both NAND flash and DRAM.

Another nonvolatile memory technology that shows promise is ferroelectric memory (FRAM or FeRAM), which is built on a ferroelectric capacitor architecture that incorporates a mechanism for controlling polarities. Ferroelectric memory offers high read and write speeds, low power consumption, and high endurance. But in its current form, it has a very low density and its processing costs are high.

Nanotube RAM (NRAM) is another nonvolatile memory technology that’s being actively researched for its DRAM-like performance, low power consumption, and ability to withstand extreme environmental conditions. Nanotube RAM can also retain data far beyond NAND flash capabilities. An NRAM device is made up of tiny carbon nanotubes that are extremely strong and have conductive properties. The nanotubes sit between two electrodes through which voltage is applied to change the resistance, providing the structure for data storage.

Researchers are also focusing on Magnetic RAM (MRAM), which could potentially deliver speeds on par with static RAM (SRAM). Magnetic RAM—also called magnetoresistive RAM—is a nonvolatile memory technology that uses magnetic states to store data bits, rather than using electrical charges like other memory technologies.

Vendors are pursuing different strategies for implementing MRAM. One of the most promising is spin-transfer torque MRAM (STT-MRAM), which leverages the quantum mechanical angular momentum of electrons (spin) to store data. The biggest challenge with MRAM, however, is its extremely low density.

All of these memory types—along with others being investigated—are in various stages of research and development. Although several vendors already offer products based on some of these technologies, today’s research is what will drive them into the future and make it possible to create a memory-storage stack in which all memory is nonvolatile, profoundly changing the way we deliver applications and store data.

What does the future hold?

The memory technologies I’ve discussed so far are mostly works in progress, with vendors looking for ways to make them more practical and profitable beyond a handful of small niche use cases. But researchers are also looking further into the future, working on technologies that are still in their infancy or have been around for a while but are now being infused with new efforts.

One area of research that’s caught the industry’s imagination is silica glass, which can be used to store data much like the crystals that taught Superman about his Krypton roots. This idea of silica glass got its boost in 2013 from researchers at the University of Southampton, who demonstrated storing a 300 KB text file in fused glass.

The storage medium, referred to as 5D memory crystal, or 5D storage, relies on superfast femtosecond laser technology, like that used for refractive surgery. The laser etches microscopic nanogratings into the glass to provide the data bit structure. A special technique is then used to retrieve the data, taking advantage of the light’s polarization and intensity.

According to the researchers, a 25-mm silica disk could store as much as 360 TB of data, sustain temperatures up to 190 degrees Celsius, and remain viable for over 13 billion years, making today’s storage media seem like cardboard cutouts. In fact, 5D storage has already received its fair share of attention. A silica disk storing Isaac Asimov’s Foundation series now orbits the sun, sitting inside Elon Musk’s cherry red Tesla Roadster, which was launched aboard SpaceX’s Falcon Heavy rocket.

Microsoft was so impressed with the 5D storage technology that it has launched its own initiative, dubbed Project Silica, whose stated goal is to develop the “first-ever storage technology designed and built from the media up, for the cloud.” Project Silica uses femtosecond lasers to write data into quartz glass, the same process used for 5D storage. As its first proof of concept, Microsoft teamed up with Warner Bros. to store and retrieve the entire 1978 Superman movie on a piece of glass about the size of a drink coaster.

Another innovative approach to data storage is racetrack memory, which was first proposed by IBM researchers in 2008. Racetrack memory applies electrical current to nanowires to create domain walls with opposite magnetic regions between them (thus the racetrack concept). The domain walls and their regions provide a structure for efficiently storing data. IBM hopes that racetrack technology might eventually yield a nonvolatile, solid-state storage device that can hold 100 times more data than current technologies at a lower cost-per-GB.

Other researchers are pursuing a different approach to racetrack memory, leveraging the inherent properties in skyrmions, which are microscopic swirls found in certain magnetic materials. Skyrmions work in conjunction with anti-skyrmions to create opposing magnetic swirls that can be used to create a three-dimensional structure for hosting digital data. Skyrmion-based storage requires very little current and has the potential for storing large quantities of data while delivering high-speed performance.

Scientists are also researching the potential of storing data at the molecular level. One of the most publicized approaches is DNA, in which data is encoded directly into the genetic material. Corporate, university, and government researchers are actively pursuing DNA’s potential for persisting data. DNA can store massive amounts of information, is millions of times more efficient than anything we have today, requires almost no maintenance, and can endure for many millennia.

Figure 3. Betting on DNA storage (image by geralt)

The challenge with DNA storage, however, is that it’s error-prone and expensive to produce. To address these issues, scientists have been experimenting with multiple solutions. For example, researchers at the University of Texas at Austin have come up with error-correcting algorithms that help compensate for the high rate of errors. Using synthetic DNA, they have successfully stored the entire book The Wizard of Oz, translated into Esperanto. But this is nothing compared to DNA’s true potential. As many have claimed, DNA could make it possible to store the entire internet in a shoe box.

Despite the enthusiasm around DNA storage, researchers are also investigating different molecular storage techniques, using molecules that are smaller than DNA and other long-chain polymers. The big advantage here is that smaller molecules can be cheaper and easier to produce, and they have the potential for storing more data. If that’s not small enough, scientists are also researching single-atom data storage, with each bit stored in an individual atom. So far, I’ve come across no discussions about going smaller.

Where do we go from here?

If technologies such as molecular storage and silica glass storage can be manufactured in a way that is both efficient and cheap, we’ll be better prepared to handle all the data that’s expected in the years to come. But we have a long way to go before we get there, and until then, we’ll have to rely on the advancements being made with NAND flash and its alternatives, as well as with SCM. What we’ll do with all that data once we figure out how to store it is another matter altogether. In terms of storage, however, the sky is indeed the limit.

SQL – Simple Talk

Alexa and Google Assistant execs on future trends for AI assistants

July 17, 2020   Big Data

Businesses and developers building conversational AI experiences should start with the understanding that they’re going to have to use unsupervised learning to scale, said Prem Natarajan, Amazon head of product and VP of Alexa AI and NLP. He spoke with Barak Turovsky, Google AI director of product for the NLU team, at VentureBeat’s Transform 2020 AI conference today as part of a conversation about future trends for AI assistants.

Natarajan called unsupervised learning for language models an important trend for AI assistants and an essential part of creating conversational AI that works for everyone. “Don’t wait for the unsupervised learning realization to come to you yet again. Start from the understanding that you’re going to have to use unsupervised learning at some level of scale,” he said.

Unsupervised learning draws inferences from raw, unlabeled data. A complementary trend, Natarajan said, is the development of self-learning systems that can adapt based on signals received from interacting with a person speaking with Alexa.
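
As a generic illustration of the idea (not Alexa’s or Google’s actual pipeline), the sketch below clusters unlabeled utterances purely by their lexical similarity, with no labels involved:

```python
# Generic unsupervised-learning illustration: group unlabeled utterances by
# similarity using TF-IDF features and k-means, with no labeled data at all.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

utterances = [
    "play some jazz music",
    "put on relaxing piano songs",
    "what's the weather tomorrow",
    "will it rain this weekend",
    "set a timer for ten minutes",
    "start a five minute countdown",
]

features = TfidfVectorizer().fit_transform(utterances)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for label, utterance in sorted(zip(labels, utterances)):
    print(label, utterance)   # similar requests tend to land in the same cluster
```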

“It’s the old thing, you know: If you fail once, that’s OK, but don’t make the same failures multiple times. And we’re trying to build systems that learn from their past failures,” he said. Members of Amazon’s machine learning team and conversational AI teams told VentureBeat last fall that self-learning and unsupervised learning could be key to more humanlike interactions with AI assistants.

Another continuing trend is the evolution of trying to weave features into experiences. Last summer, Amazon launched Alexa Conversations in preview, which fuses together Alexa skills into a single cohesive experience using a recurrent neural network to predict dialog paths. For example, the proverbial night out scenario involves skills for buying tickets, making dinner reservations, and making arrangements with a ridesharing app. At the June 2019 launch, Amazon VP of devices David Limp referred to Amazon’s work on the feature as “the holy grail of voice science.” Additional Alexa Conversations news is slated for an Amazon event next week.

Natarajan and Turovsky agreed that multimodal experience design is another emerging trend. Multimodal models combine input from multiple mediums like text and photos or videos. Some examples of models that combine language and imagery include Google’s VisualBERT and OpenAI’s ImageGPT, which received an honorable mention from the International Conference on Machine Learning (ICML) this week.
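
At their core, multimodal models learn a joint representation from more than one input type. The sketch below is a deliberately minimal, NumPy-only illustration of that fusion step; the shapes and random weights are placeholders, not those of VisualBERT, ImageGPT, or any real model.

```python
# Minimal illustration of multimodal fusion: project embeddings from two modalities
# into a shared space and combine them into one joint vector. Shapes and weights
# here are placeholders, not those of any real model.
import numpy as np

rng = np.random.default_rng(0)

text_embedding = rng.normal(size=768)     # stand-in for a language encoder's output
image_embedding = rng.normal(size=2048)   # stand-in for a vision encoder's output

# In a trained model these projections are learned; here they're random placeholders.
project_text = rng.normal(size=(512, 768)) * 0.02
project_image = rng.normal(size=(512, 2048)) * 0.02

fused = np.concatenate([project_text @ text_embedding, project_image @ image_embedding])
print(fused.shape)   # (1024,) -- a single representation for downstream prediction layers
```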

Turovsky talked about advances in working within the limited number of answers voice alone can offer. Without a screen, he said, there’s no infinite scroll or first page of Google search results, and so responses should be limited to three potential results, tops. For both Amazon and Google, this means building smart displays and emphasizing AI assistants that can both share visual content and respond with voice.

In a conversation with VentureBeat in January, Google AI chief Jeff Dean predicted progress in multimodal models in 2020. The advancement of multimodal models could lead to a number of benefits for image recognition and language models, including more robust inference from models receiving input from more than a single medium.

Another continuing trend, Turovsky said, is the growth of access to smart assistants thanks to the maturation of translation models. Google Assistant is currently able to speak and translate 44 languages.

In a separate presentation earlier today, Turovsky detailed steps Google has taken to remove gender bias from language models. Powered by unsupervised learning, Google introduced changes earlier this year to reduce gender bias in neural machine translation models.

“In my opinion, we are in the early stages of this war. This problem could be seemingly simple; a lot of people could think it’s very simple to fix. It’s extremely hard to fix, because the notion of a bias in many cases doesn’t exist in an AI environment, when we watch it learn, and get both training data and train models to actually address it well,” Turovsky said. Indeed, earlier this year researchers affiliated with Georgetown University and Stanford University found that automatic speech detection systems from companies including Amazon and Google work better for White users than for Black users.

Big Data – VentureBeat

Teleoperated scooters: Go X and Tortoise test micromobility’s future in Georgia pilot

July 7, 2020   Big Data

Micromobility is evolving in a northern suburb of Atlanta, Georgia, where the sight of an electric scooter rolling down the street without a rider is likely causing more than a few double-takes. The City of Peachtree Corners has launched an experiment with partner companies Tortoise and Go X to deploy 100 scooters that can be remotely operated.

The Go X Apollo service allows riders to order a scooter via a mobile app. The scooter comes right to them, and when they’re finished it returns to a station on its own. The experiment is a first step toward developing fully autonomous scooters. But even in this phase, participants hope these remote-controlled scooters can address many of the operational and environmental issues that have plagued the micromobility industry over the past few years.

“In metro Atlanta, traffic is a major concern for everyone,” said Brandon Branham, assistant city manager and chief technology officer for Peachtree. “We didn’t have [micromobility services], and we wanted to change that. But how do you do that in a way that we’re not fighting issues that downtown Atlanta and others see with the clutter and the other issues brought by doing this?”

Micromobility options such as electric scooters and dockless bikes have been popping up around the globe as companies like Lime and Bird raised more than a combined $1 billion in venture capital. The idea is for electric scooters to get people out of their cars and help cities reduce traffic and pollution. But amid the scooter boom came a surge in complaints and frustrations from affected communities.

The scooters tended to pile up in limited locations, cluttering the city and blocking pedestrian paths. The chaos also made them inconvenient for riders and forced companies to contract people to retrieve the scooters, charge them, and then redistribute them to more locations. Companies typically lose money on each ride, due to the cost of maintenance and replacement. And cities received growing complaints about piles of scooters becoming eyesores, creating problems for people with impaired mobility, contravening Americans with Disabilities Act (ADA) guidelines by blocking sidewalks, and frustrating residents who resented out-of-control scooter riders.

Fixing scooters’ woes

Lime, Bird, and others have said they are addressing these concerns, and cities are also implementing new regulations. But the founders of Tortoise thought they could address many of these problems by making scooters more predictable and convenient for riders and cities alike.

Tortoise cofounder David Graham once built a lawnmower that used a combination of remote and autonomous technologies with only $100 in off-the-shelf hardware. Last year, he teamed up with Dmitry Shevelenko to adapt this system to electric scooters.

Graham and Shevelenko aren’t the only ones with this idea. Singapore-based robotics company CtrlWorks had developed a project called Scootbee that built autonomous scooters. But after a brief test in Malaysia, it appears Scootbee has shut down. There were also rumors that Uber had a self-driving scooter in the works. And in August 2019, Segway-Ninebot introduced the autonomous KickScooter T60, though it doesn’t appear to be operational in any cities yet.

So for the moment, Tortoise appears to be leading the pack. The company took existing scooters and retrofitted them with an additional camera, more powerful processors, and retractable training wheels (rather than a kickstand) to make them more stable when they are moving without a rider. Tortoise then launched a remote operations center in Mexico City, from which staff can remotely control the scooters and guide them down streets, either to a rider or back to the bike station.

Shevelenko said remotely repositioned scooters offer a number of advantages. First off, they are better for the environment because companies don’t need to drive around picking up and redistributing the vehicles. That also helps with the unit economics and durability because the scooters don’t get thrown down and knocked over when they aren’t being used.

The current platform would allow the scooters to run autonomously for about 30% of their trip, but the company has opted not to turn on that functionality for now. Shevelenko wants cities and residents to get accustomed to seeing riderless scooters in their community before implementing that feature.

“Our goal is to crawl, walk, run,” he said. “We don’t want it to be a shock when you start using that autonomy software.”

Even when Tortoise does activate the autonomous features, the company anticipates it will continue to use the teleoperation center for the bulk of the scooters.

Tortoise’s business model is built on “repositioning as a service.” This allows other companies to build the Tortoise platform into their scooters and apps. The system is mainly a reference design that anyone can use to adapt their own scooter fleet. Tortoise then charges a per-minute usage fee to remotely reposition scooters.

“That’s why we picked the name ‘Tortoise,’” Shevelenko said. “They move slow. They win. And they live a long time. If you’re trying to build a 100-year company and you depend on the public rights of way, you have to get off on the right foot with public regulators.”

The Peachtree experiment

Peachtree Corners is about 20 minutes from downtown Atlanta and has 45,000 residents. It has become one of the region’s most dynamic economic development hubs, in part because it’s home to Technology Park Atlanta, a business center designed to attract innovative companies. The park includes Curiosity Labs, an incubator focused on mobility and smart cities. It has created a 1.5-mile testing track for autonomous vehicles and partnered with Sprint to roll out a 5G network that is already operational. As a planned community that’s home to innovative companies, Peachtree is an ideal testbed for these new scooters.

Go X is offering the initial phase of the service to the 8,000 employees working within the 500-acre office park. Go X manages the scooters for the city of Peachtree, and its app makes an API request to the Tortoise system. The local terrain is mostly flat, but a 13% grade at one section of the autonomous testing track presents an additional challenge as Tortoise and Go X put the system through its paces.
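
The integration described here, in which Go X’s app requests a scooter from the Tortoise system, might look roughly like the sketch below. The endpoint, field names, and token are invented for illustration; neither company’s actual API is documented in the article.

```python
# Hypothetical sketch of a "repositioning as a service" request. The URL, fields,
# and authentication below are invented for illustration; Tortoise's real API is
# not documented in the article.
import requests

def summon_scooter(rider_lat: float, rider_lng: float, fleet_id: str, api_token: str) -> dict:
    response = requests.post(
        "https://repositioning.example.com/v1/reposition",   # placeholder endpoint
        headers={"Authorization": f"Bearer {api_token}"},
        json={
            "fleet_id": fleet_id,                 # e.g., the Go X Apollo fleet
            "destination": {"lat": rider_lat, "lng": rider_lng},
            "mode": "deliver_to_rider",           # or "return_to_station" after a ride
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()    # e.g., assigned scooter ID and estimated arrival time
```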

As the launch is coinciding with the coronavirus pandemic, Go X and the city had to adapt some of their original plans. Adjustments include periodically sanitizing the scooters. Go X CEO Alexander Debelov said the network has 20 docking stations and a local employee will be notified when 10 or more scooters need to be disinfected. The company expects scooters will need to be disinfected two to three times a day, Debelov said.

Even at that rate, operational costs will be much lower than for traditional scooter-sharing ventures because Go X won’t need to contract numerous people to retrieve the scooters and recharge them, according to Debelov.

From the perspective of Peachtree officials, the experiment is a chance to get ahead of potential traffic issues. The Atlanta Tech Park is near hotels, restaurants, and various other shops that employees and local residents can access via the Go X scooters during the day. Eventually, the service will be rolled out to the entire city. It will hopefully be directly connected to public transportation options for getting into and out of Atlanta.

“Once we get through that phase and we feel confident, we’ll expand it,” Branham said. “Downtown is about a mile and a half north of the testing facility. And so we’ll add those connection points between there and [Atlanta Technology] Park and then finally roll it out into the citywide residential and business side.”

The Go X scooters also offer an opportunity to burnish Peachtree’s reputation as a tech hub. Other mobility companies, such as Local Motors, are already testing autonomous shuttles in the city. In shaping regulations around scooter-sharing, the city included requirements that services have an autonomous or remote-controlled option.

“The city took an aggressive approach,” Branham said. “We know we need this, but how do we do it and not face the problems other cities have had?”

Now that Go X and Tortoise have soft-launched, part of the mission becomes educating residents about the technology. The city went through a similar exercise when it started testing the Local Motors shuttles. For that trial, Branham organized events and invited the public to try the shuttles and learn about the technology.

The same will likely happen with the scooters. In part, autonomous or remote-controlled vehicles are a social experiment. How quickly will people accept them? What are the factors that make them seem more or less reliable? The city has been posting information on its social media accounts and has created a dedicated spot on its website to provide information and updates.

“The faces and looks you get when you see that scooter driving down the road by itself, it’s fun to watch them,” Branham said. “We were doing some testing, and some 15 cars drove by and they all stop and they’re staring.”

The success of this venture could determine to a large degree how quickly other cities embrace similar experiments.

“I think we are really paving the way,” Branham said. “I’ve had a ton of phone calls from other cities and been on the phone a lot, just working through this with other cities. All cities are competing against each other for economic development. But when it comes down to it, we’re all after the same thing, and that’s providing a better service to our residents and our businesses.”

Big Data – VentureBeat

The Future of Analytics is Converged and Augmented

July 2, 2020   TIBCO Spotfire

In times of rapid change, analytics are an adaptive imperative—but analytics itself is changing as well. The definition of what we’ve known as “augmented analytics” is expanding, and this expansion represents a new opportunity for data leaders, data analysts, and data scientists to help their organizations more quickly adapt to change.

The Impact of Blurring Boundaries on Augmented Analytics 

This transformation is especially affecting the analytics and BI and the data science and machine learning markets. As augmented analytics brings these two markets together, consumers’ needs are changing, and providers’ capabilities are changing to meet those needs. Capabilities are merging: we’re seeing data science capabilities in visual analytics tools and vice versa. Even streaming capabilities have blurred the traditional boundaries of analytics markets.

With these capabilities unified and available across different analytics solutions, data scientists and data analysts today can select the right capabilities to get the job done. This can mean a more seamless analytics workflow and a more user-friendly experience. It also means that organizations no longer need to compromise on the best capabilities to adapt to change. 

Analytics Maturation: Opportunities and Challenges Ahead

According to Gartner’s recent ground-breaking report, Worlds Collide as Augmented Analytics Draws Analytics, BI, and Data Science Together(1), “Organizations that seize the opportunities of the newly catalyzed market could dramatically accelerate their analytics maturation”.

If you’re a data and analytics leader, we encourage you to read the Gartner report on the convergence of these markets. From the report, we believe you will learn about:

  • Drivers of this convergence of the analytics and data science markets
  • Blurring analytics roles and responsibilities 
  • Reimagining the traditional step-wise approach to analytics
  • Rethinking the strategy of analytics and data science capabilities 

While this convergence represents an immense opportunity for leaders to gain a competitive advantage, it also poses a challenge for those same organizations to fulfill new customer demands. Author of the report, Carlie Idoine, writes, “The convergence of analytics and BI and DSML platform capabilities, together with the broader collision of data and analytics worlds, is unsettling. Consequently, data and analytics leaders are struggling to understand what capabilities are available and how best to meet the needs of their changing user base, which increasingly includes both technical and nontechnical, expert and citizen, users.” 

Download a complimentary copy of the report to learn more about the implications of augmented analytics for data and analytics leaders.

 (1) Gartner, Worlds Collide as Augmented Analytics Draws Analytics, BI and Data Science Together, Carlie Idoine, 10 March 2020 | GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally, and is used herein with permission. All rights reserved.

The TIBCO Blog

COVID-19’s Immediate and Future Impact on Marketing

May 28, 2020   CRM News and Info

COVID-19 is forcing us to adapt the way we live our lives, raise our families, and conduct our businesses. And as the global economic downturn impacts companies big and small, we need to update our thinking, our approach, and our execution in marketing our offerings and value, likely for the long haul.

Keep reading to learn how.

Find Ways To Do More With Less

With the threat of a pandemic and the short- and long-term impact it could have on our economy, marketing will likely need to seek ways to achieve more with less — even as our target audiences are tightening their spending.

Here are a few business suggestions for marketers to consider based on the current reality.

Be Brutal With Priorities

More than ever, marketers need to re-evaluate their priorities and take a hard look at what is important right now. It is not business as usual. Focus on the initiatives and messages that will generate the most business value against the backdrop of the pandemic, while also increasing engagement with your company’s existing customer base to tighten bonds. Being brutal with priorities might mean pausing or scaling back experimental campaigns, even ones with real potential.

Choose the Most Cost-Effective Tools

You need to look for technology solutions that help you consolidate and move faster with less while freeing up valuable marketing resources that can be applied elsewhere. As you begin planning for 2021, consider what your marketing tech stack looks like and find ways to focus on the core. Expanding your focus beyond the traditional top-of-funnel demand generation is critical, so look for centralized tools that can be used across your marketing functions.

Identify Team Efficiencies

Now that you’ve brutally prioritized your initiatives, developing clear and concise processes is vital — across teams and technologies. Streamlining your processes will give your marketing team the best chance to execute at higher-than-normal levels, which is essential to counter the negative macroeconomic conditions. Define your short- and long-term goals clearly with stakeholders, and equip your team with the right collaboration tools, documented processes, data collection, and automation capabilities to get the job done better with less.

Roll Out the (Virtual) Red Carpet for Existing Customers

In any recession or crisis environment, a focus on customer retention becomes absolutely paramount in protecting your business and revenue. By doubling down on your customer marketing efforts to deliver great experiences and quality engagement with your installed base, you can begin to build “brand moats” that maintain revenue streams and, most importantly, inspire your happy customers to attract new ones.

Customer expectations have changed greatly in the last 5 years. When your customers feel more connected to your brand, they’re 57% more likely to increase their spending and 76% more likely to buy from you instead of one of your competitors. Therefore, you need to truly invest in creating authentic connections by understanding what makes your customers tick — their interests, their favorite content types and topics, their current pain points, and their ambitions.  Take that data and build a strong engagement plan that blows them away.

Lastly, an authentic and personalized approach is even more critical in today’s environment to cut through the noise, as 75% of consumers report avoiding advertising altogether. Meanwhile, brands with at least five online reviews from happy customers enjoy a 270% increase in sales. As more and more buyers turn to social and professional networks (as well as third-party review sites) to make purchasing decisions, we need to focus on delivering engaging, growth-centric adoption and retention campaigns that deepen the connection between customer and brand.

Follow these proven techniques to make it happen.

  • Create great engagement strategies to welcome, onboard, and check in with your newer customers. Hear what they have to say early and feed it into your organization.
  • Really focus on personalization across your digital communications, especially with repeat or long-term customers. Go the extra mile to make sure they know you hear them. Now more than ever, customers do not want to be treated like a number.
  • Invest in virtual events now. Customers want community, and there will likely not be any live events until 2021. By being the organization that creates these social connections with your customers, you continue to build the value of your brand.
  • Listen harder than ever. Use sites like Yelp and G2 to gather feedback, analyze the information, identify patterns, and then apply what you’ve learned to improve consumer confidence and satisfaction.

Get Serious About Digital — and Do It Now

Even before the pandemic, 87% of all consumers began their product searches online. And now that many storefronts are closed or limited and in-person events, conferences, and user groups are on hold, being exceptional at digital marketing is more important than ever and will help you stand out. To do this well, you need to ensure that you are collecting data on your customers’ digital behavior across your digital properties, online services, and products. Leverage this critical data in your customer engagement strategies to deliver more consistent value. Doing this well will endear you to your customer base and help you distance yourself from the competition.

Do so by using simple yet sophisticated tools that are focused on driving engagement and delivering great experiences throughout the customer lifecycle. As budgets and workforces shrink, you need tech that is agile and that small teams can use easily and effectively. Instead of automatically defaulting to enterprise solutions with excessive feature sets, really think about how you can achieve your goals efficiently and at a price point that allows you to scale within your budget.

Marketing Has Changed, but the Goal Remains the Same

The coronavirus is changing how we market to individual consumers and massive corporations, but the primary goal will always stay the same: generate the kind of brand excitement, product stickiness, and customer affinity that results in sustainable and scalable business growth. 

If you want some help doing that with a really great platform built by some really smart people, please let us know. We’ll be here — now and in the future.


Act-On Blog


The Future Plant: Current Challenges In Asset Performance

May 12, 2020   SAP

Part 1 of a three-part series

When Horst started his work as a machine technician at a manufacturing plant 20 years ago, asset management looked very different than it looks today. Having climbed up the career ladder to become an asset manager, Horst has created a modern maintenance environment that tackles many of the major problems German manufacturing companies are concerned with.

Horst no longer has to do a daily tour through the plant to note downed-machine issues or check on maintenance due dates. Instead, Horst uses asset management software that gives him a constant overview of all assets, right from his desk. Every asset is represented by a digital twin and can be monitored continuously via a visual display.

By continuously collecting relevant data, designated devices automatically enrich an asset’s digital twin with information about its current performance and condition. Data analytics algorithms can use this information to generate a set of relevant KPIs throughout each asset’s entire lifecycle.

For Horst, it is crucial to always be prepared for any possible machine breakdown. Therefore, he is especially interested in knowing an asset’s mean time to failure (MTTF) or mean time between failures (MTBF), as well as the frequency of these incidents.

Knowing particular failures, and how often they typically occur with certain assets, helps Horst map machine problems to common failure modes and understand when a failure is likely to happen. It also supports him in grouping his assets into risk categories based on how often failures occur, how severe they are, and how easily they can be detected. This entire process is called Failure Mode Analytics – an important analysis for strategic asset management that is strongly enabled by the ability to monitor each asset’s performance.
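
To make this concrete, here is a minimal sketch of how MTBF and a simple failure-mode risk score could be computed from a failure log. It is purely illustrative and not taken from any particular asset management product; the sample timestamps and the 1-to-10 rating scales are assumptions.

    from datetime import datetime
    from statistics import mean

    # Illustrative failure log for one asset: (failure_start, back_in_service) pairs.
    failure_log = [
        (datetime(2020, 1, 3, 8, 0),   datetime(2020, 1, 3, 11, 30)),
        (datetime(2020, 2, 17, 14, 0), datetime(2020, 2, 17, 16, 0)),
        (datetime(2020, 4, 2, 9, 15),  datetime(2020, 4, 2, 13, 45)),
    ]

    def mtbf_hours(log):
        # Mean time between failures: average uptime from the end of one repair
        # to the start of the next failure.
        uptimes = [(log[i + 1][0] - log[i][1]).total_seconds() / 3600
                   for i in range(len(log) - 1)]
        return mean(uptimes)

    def risk_score(occurrence, severity, detectability):
        # FMEA-style score: each factor rated 1 (low) to 10 (high);
        # for detectability, 10 means the failure is hard to detect.
        return occurrence * severity * detectability

    print(f"MTBF: {mtbf_hours(failure_log):.0f} h")
    print(f"Risk score: {risk_score(occurrence=4, severity=7, detectability=3)}")

Grouping assets into bands on such a score (for example, treating anything above a chosen threshold as high risk) would reproduce the kind of risk categories described above.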

Two other important KPIs become relevant once a predicted failure occurs: mean time to repair (MTTR) and mean downtime. As the main measures of machine availability, these KPIs should be kept as low as possible to maximize production continuity.

Following the principles of lean management, Horst is constantly engaged in putting appropriate measures in place to reduce the time a machine is down for repair. In this context, respective breakdown costs also play a meaningful role in managing asset performance.

Last, but not least, an asset’s comprehensive performance can be evaluated in the Overall Equipment Effectiveness KPI. This KPI indicates the percentage of time in which an asset is producing only good parts (quality) as fast as possible (performance) with no stop time (availability). Combining the aspects of quality, performance, and availability makes this measure a very powerful tool for Horst in assessing his assets and in gaining data-based knowledge about his overall plant productivity.
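
As a rough illustration of how the three factors combine, here is a short sketch of an OEE calculation; the shift numbers are invented and not tied to any real plant or software.

    def oee(planned_time_min, downtime_min, ideal_cycle_time_min, total_count, good_count):
        # Overall Equipment Effectiveness = Availability x Performance x Quality.
        run_time = planned_time_min - downtime_min
        availability = run_time / planned_time_min
        performance = (ideal_cycle_time_min * total_count) / run_time
        quality = good_count / total_count
        return availability * performance * quality

    # Example shift: 480 planned minutes, 45 minutes of downtime, a 1.0-minute
    # ideal cycle time, 400 parts produced, 388 of them good.
    print(f"OEE: {oee(480, 45, 1.0, 400, 388):.1%}")  # roughly 81%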

The variety of KPIs makes it possible to have continual, real-time insight into all assets and their performance. For Horst, who always needs a thorough overview of his assets’ current state, this really makes life easier. More importantly, the asset performance software equips him with a reliable base for decision-making.

While in the past, most decisions were made based on gut feeling, today the digital twin and its KPIs serve as the source for making machine diagnoses and determining asset maintenance routines. Also, standardized KPIs allow comparisons between several groups of assets or across different plants. This makes processes more transparent and more reliable, therefore helping Horst achieve the best possible asset operation.

With enabling technologies for the smart factory, companies are achieving Mission Unstoppable: making facilities management a transparent, manageable process.


Digitalist Magazine


IonQ CEO Peter Chapman on how quantum computing will change the future of AI

May 10, 2020   Big Data

Businesses eager to embrace cutting-edge technology are exploring quantum computing, which depends on qubits to perform computations that would be much more difficult, or simply not feasible, on classical computers. The ultimate goals are quantum advantage, the inflection point when quantum computers begin to solve useful problems, and quantum supremacy, when a quantum computer can solve a problem that classical computers practically cannot. While those are a long way off (if they can even be achieved), the potential is massive. Applications include everything from cryptography and optimization to machine learning and materials science.

As quantum computing startup IonQ has described it, quantum computing is a marathon, not a sprint. We had the pleasure of interviewing IonQ CEO Peter Chapman last month to discuss a variety of topics. Among other questions, we asked Chapman about quantum computing’s future impact on AI and ML.

Strong AI

The conversation quickly turned to Strong AI, or Artificial General Intelligence (AGI), which does not yet exist. Strong AI is the idea that a machine could one day understand or learn any intellectual task that a human being can.

“AI in the Strong AI sense, that I have more of an opinion just because I have more experience in that personally,” Chapman told VentureBeat. “And there was a really interesting paper that just recently came out talking about how to use a quantum computer to infer the meaning of words in NLP. And I do think that those kinds of things for Strong AI look quite promising. It’s actually one of the reasons I joined IonQ. It’s because I think that does have some sort of application.”


In a follow-up email, Chapman expanded on his thoughts. “For decades it was believed that the brain’s computational capacity lay in the neuron as a minimal unit,” he wrote. “Early efforts by many tried to find a solution using artificial neurons linked together in artificial neural networks with very limited success. This approach was fueled by the thought that the brain is an electrical computer, similar to a classical computer.”

“However, since then, I believe we now know, the brain is not an electrical computer, but an electrochemical one,” he added. “Sadly, today’s computers do not have the processing power to be able to simulate the chemical interactions across discrete parts of the neuron, such as the dendrites, the axon, and the synapse. And even with Moore’s law, they won’t next year or even after a million years.”

Chapman then quoted Richard Feynman, who famously said “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

“Similarly, it’s likely Strong AI isn’t classical, it’s quantum mechanical as well,” Chapman said.

Machine learning

One of IonQ’s competitors, D-Wave, argues that quantum computing and machine learning are “extremely well matched.” Chapman is still on the fence.

“I haven’t spent enough time to really understand it,” he admitted. “There clearly is a lot of people who think that ML and quantum have an overlap. Certainly, if you think of 85% of all ML produces a decision tree. And the depth of that decision tree could easily be optimized with a quantum computer. Clearly there’s lots of people that think that generation of the decision tree could be optimized with a quantum computer. Honestly, I don’t know if that’s the case or not. I think it’s still a little early for machine learning, but there clearly is so many people that are working on it. It’s hard to imagine it doesn’t have application.”

Again, in an email later, Chapman followed up. “ML has intimate ties to optimization: many learning problems are formulated as minimization of some loss function on a training set of examples. Generally, Universal Quantum Computers excel at these kinds of problems.”
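
To anchor the terminology in Chapman’s note, here is a minimal classical sketch of “minimizing a loss function on a training set”: plain gradient descent on a one-parameter least-squares fit. It is purely illustrative, runs on ordinary hardware, and has no quantum component; the data points are made up.

    import numpy as np

    # Tiny training set: fit y = w * x by minimizing the mean squared error.
    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.1, 3.9, 6.2, 8.1])

    def loss(w):
        return np.mean((w * x - y) ** 2)

    w, lr = 0.0, 0.01
    for _ in range(500):
        grad = np.mean(2 * (w * x - y) * x)  # derivative of the loss w.r.t. w
        w -= lr * grad                       # gradient descent step

    print(f"w = {w:.3f}, loss = {loss(w):.4f}")

The claim above is that this kind of optimization landscape is where quantum computers could eventually outperform classical machines.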

Chapman listed three improvements in ML that quantum computing will likely allow:

  • The level of optimization achieved will be much higher with a QC as compared to today’s classical computers.
  • The training time might be substantially reduced because a QC can work on the problem in parallel, where classical computers perform the same calculation serially.
  • The amount of permutations that can be considered will likely be much larger because of the speed improvements that QCs bring.

AI is not a focus for IonQ

Whether the topic is Strong AI or ML, IonQ isn’t particularly interested in building either itself. The company leaves that part to its customers and future partners.

“There’s so much to be done in a quantum,” Chapman said. “From education at one end all the way to the quantum computer itself. I think some of our competitors have taken on lots of the entire problem set. We at IonQ are just focused on producing the world’s best quantum computer for them. We think that’s a large enough task for a little company like us to handle.”

“So, for the moment we’re kind of happy to let everyone else work on different problems,” he added. “We just think, producing the world’s best quantum computer is a large enough task. We just don’t have extra bandwidth or resources to put into working on machine learning algorithms. And luckily, there’s lots of other companies that think that there’s applications there. We’ll partner with them in the sense that we’ll provide the hardware that their algorithms will run on. But we’re not in the ML business per se.”


Big Data – VentureBeat


School of Rock, One Beat CPR Share Their Front Line Action Plans for Adaptation and the Future

April 24, 2020   NetSuite

Posted by Barney Beal, Content Director

For Rob Price, CEO of the School of Rock, the information swirling about in the early days of the coronavirus was a temptation to be avoided.

“We decided to get some independent and customized and proprietary medical advice from the get go, to help us synthesize what was going on rather than listening to the news and trying to reconcile between CNN and MSNBC and Fox news,” Price said on a recent NetSuite virtual event, part of the NetSuite Business Now series. “All of that input in the early days was complete mumbo jumbo.”

Price and team brought in a pediatric specialist with a background in epidemiology to help them plan how to manage their corporate- and franchise-owned music schools as things progressed. Even with that guidance, the coronavirus still caught the company off guard in many ways. It caught most businesses off guard.

According to a survey conducted during the virtual event, 48% of the nearly 350 attendees said they didn’t have a business continuity plan in place, and another 17% said they had one but found it wasn’t useful.

So, what to do? Navigating business uncertainty was the theme of last week’s virtual event in which Price was joined by his CFO, John Cappadona, along with Lawrence Franchetti, CEO of One Beat CPR Learning Center, which provides CPR training and distributes defibrillators and related health care devices.

For most businesses, the first step is taking stock and determining immediate priorities. While most companies think in 12-, 24-, and 36-month increments, uncertainty means narrowing that window significantly, said Scott Beaver, senior product manager at Oracle NetSuite and moderator of the event. Franchetti’s teams were positioned to see the warning signs early.

“We serve first responders, hospitals and everything from schools to churches,” Franchetti said. “We knew there was an emerging problem, and it was coming very quickly.”

For both School of Rock and One Beat CPR Learning Center, adapting required a swift shift in business model. It was a common concern: the biggest challenge cited by attendees of the virtual event was continuing business operations (51%), followed by order reductions (35%) and having to close their doors until restrictions are lifted (10%). Just 3% cited an existing supply chain that can’t deliver materials.

From Training to Equipment and In-person Lessons to Online

One Beat CPR shifted from a training-plus-product business to providing Personal Protective Equipment (PPE) and other equipment its customers suddenly needed, such as ventilators. While the business had sold some ventilators in the past, it took some work to make the transition, identifying its most trusted suppliers and targeting them early. One Beat CPR is now selling masks, face shields and sanitizers among other items in demand by people grappling with the coronavirus.

“We were able to work with our bank, the finance team and undergo plans to acquire products that folks on the front line needed,” Franchetti said. “We were able to work with vendors as well, to create products and business plans for our team to be successful. The first day it didn’t seem like it, but now we’re pretty nimble.”

It’s now working with manufacturers to acquire more PPE products and has started making its own face shields.

“We had to think big, bigger and biggest,” Franchetti said. “We thought, ‘why can’t we do this ourselves?’ I’ve been calling it a watershed moment for our organization. We’ve opened up additional distribution lines, tested manufacturing our own products. We’re shipping globally and getting into areas where we didn’t have a strong foothold.”

School of Rock had to quickly transition from its business of in-person music lessons.

“We needed to plan for the short term, but in context of the long term,” Price said.

Within a week, School of Rock had created an online education platform, a process that likely would have taken six months without the pressure of the current climate, Price said. Not only will that platform help the company and its franchisees survive now, but it also removes some of the reasons children don’t take lessons, like summer vacations, schools that are too far away, or students unable to drive to classes. The organization was able to do that by collaborating closely with franchisees and leaning on its culture of learning and sharing. The organization held regular meetings, built a wiki page for feedback on the platform, and shared best practices from the franchises that were managing best.

“Our community and our team has rallied around a purpose and our culture,” said Cappadona, the CFO. “We’ve done a lot of work over the last few years building that up. It has really revealed itself during this time of crisis.”

Staying Focused on the Mission

The coronavirus didn’t just introduce business challenges, it raised stress levels as well. Both companies were able to turn to their business mission to help cope.

“The reality is that we have our team [to worry about] too,” Franchetti said. “The stress and anxiety of how this will affect their home life. How are they going to come home? Anchor in yourself and your purpose. For ourselves, it’s helping save lives.”

One Beat CPR created online classroom sessions for its employees who had children home from school. It also paid for babysitting.

Both organizations were also careful and clear in communicating with customers and employees. School of Rock worked to educate franchise operators on managing their costs, including navigating conversations with their landlords.

“This has revealed to me the power and importance of individual communication,” Price said. “Every franchisee in our system has my cell phone number.”

With new business models and new learnings, both businesses are poised to bounce back when things return to normal. But as one poll during the virtual event posed—How will you get back to normal?—many are not sure. Just 5% said they would get back to business as normal, 43% predicted minor tweaks, 26% predicted significant changes and a quarter of respondents answered, “we don’t know.”

Watch the recording of the full virtual event, Navigating Business Uncertainty, and register for upcoming events.



The NetSuite Blog
