IBM adds noise to boost AI’s accuracy on analog memory

May 18, 2020 · Big Data

In a study published this week in the journal Nature Communications, researchers at IBM’s lab in Zurich, Switzerland, claim to have developed a technique that achieves both energy efficiency and high accuracy on machine learning workloads using phase-change memory. By exploiting in-memory computing methods built on resistance-based storage devices, their approach merges the units used to store and compute data, significantly cutting active power consumption in the process.
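
To make the in-memory computing idea concrete, here is an illustrative sketch (a simplification, not IBM’s actual design): weights live in a crossbar of conductances, inputs arrive as voltages, and Ohm’s and Kirchhoff’s laws deliver the matrix-vector product as summed column currents, so the multiply happens where the data is stored.

```python
import numpy as np

# Illustrative model of an analog crossbar multiply (not IBM's design):
# weights are encoded as conductances G, inputs are applied as voltages V,
# and each output line carries the current I = G @ V, i.e. the dot product
# is computed in place, with no data movement between memory and processor.
rng = np.random.default_rng(0)

G = rng.uniform(0.0, 1.0, size=(64, 128))  # conductance grid encoding weights
V = rng.normal(size=128)                   # input activations as voltages

I = G @ V  # column currents: the crossbar performs this multiply physically
```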

Many existing AI inferencing setups physically separate the memory and processing units, so AI models are stored in off-chip memory. This adds computational overhead, because data must be shuttled between the units, slowing processing and increasing power consumption. IBM’s technique ostensibly solves those problems with phase-change memory, a form of non-volatile memory that is faster than commonly used flash memory. The work, if proven scalable, could pave the way for powerful hardware that runs AI on drones, robots, mobile devices, and other compute-constrained platforms.

As the IBM team explains, the challenge with phase-change memory devices is that they tend to introduce computational inaccuracy. That’s because the technology is analog in nature: its precision is limited by device-to-device variability as well as read and write conductance noise.
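
A toy simulation conveys the problem (the noise levels below are assumed for illustration, not measured device statistics): a weight written to a cell is perturbed once by programming variability, and again by read noise every time it is sensed.

```python
import numpy as np

# Toy model of analog imprecision (assumed noise levels, not device data):
# the value read back from a cell differs from the value written, because
# programming (write) variability and per-read conductance noise both
# perturb the stored weight.
rng = np.random.default_rng(1)

weights = rng.normal(0.0, 1.0, size=1000)        # ideal trained weights
write_noise = rng.normal(0.0, 0.05, size=1000)   # programming variability
read_noise = rng.normal(0.0, 0.02, size=1000)    # noise on each read

readback = weights + write_noise + read_noise    # what inference actually sees
print("mean absolute weight error:", np.abs(readback - weights).mean())
```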

The solution the study proposes entails injecting additional noise during the training of AI models in software to improve their resilience. The results suggest the approach works: a ResNet model trained with noise achieved 93.7% accuracy on the popular CIFAR-10 data set and 71.6% top-1 accuracy on ImageNet after the trained weights (i.e., the parameters that transform input data) were mapped to phase-change memory components. Moreover, after the weights of one model were mapped onto 723,444 phase-change memory devices in a prototype chip, accuracy stayed above 92.6% over the course of a day, which the researchers claim is a record.
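
The article doesn’t reproduce the paper’s exact recipe, but a minimal sketch of noise-injected training, assuming additive Gaussian weight noise scaled to the weight range (the NoisyLinear layer below is hypothetical), might look like this in PyTorch:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyLinear(nn.Linear):
    # Hypothetical layer: during training, every forward pass perturbs the
    # weights with Gaussian noise scaled to the largest weight magnitude,
    # mimicking conductance noise that is relative to the device's range.
    def __init__(self, in_features, out_features, noise_std=0.05):
        super().__init__(in_features, out_features)
        self.noise_std = noise_std

    def forward(self, x):
        if self.training:
            span = self.weight.abs().max().detach()
            noise = torch.randn_like(self.weight) * self.noise_std * span
            return F.linear(x, self.weight + noise, self.bias)
        return super().forward(x)  # clean weights at evaluation time

layer = NoisyLinear(128, 10)
y = layer(torch.randn(32, 128))  # training mode by default: noisy forward
```

Because the network repeatedly sees perturbed versions of its own weights, gradient descent is pushed toward solutions that tolerate the same kind of perturbation the analog hardware later introduces.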

In an attempt to further improve accuracy retention over time, the study’s coauthors also developed a compensation technique that periodically corrects the activation functions (the equations that determine a model’s output) during inference. This improved accuracy on hardware to 93.5%, they say.
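
IBM’s precise correction procedure isn’t described in the article; the sketch below shows one plausible form such a compensation could take, assuming a global scale factor recovered from a fixed calibration input (all names and numbers here are hypothetical):

```python
import numpy as np

# Hedged sketch of periodic drift compensation (illustrative, not IBM's
# exact procedure): pass a fixed calibration input through a layer, compare
# the summed response with the response recorded right after programming,
# and rescale the layer's output to cancel the global conductance drift.
rng = np.random.default_rng(2)

calib_input = rng.normal(size=128)
weights_t0 = rng.normal(size=(64, 128))             # conductances at time 0
reference = np.abs(weights_t0 @ calib_input).sum()  # recorded once, at t0

def compensated_output(weights_now, x):
    current = np.abs(weights_now @ calib_input).sum()
    gamma = reference / current                      # global correction factor
    return gamma * (weights_now @ x)

drifted = weights_t0 * 0.9                           # uniform conductance drift
x = rng.normal(size=128)
print(np.allclose(compensated_output(drifted, x), weights_t0 @ x))  # True
```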

In parallel, the team experimented with training machine learning models directly with analog phase-change memory components. Using a mixed-precision architecture, they report attaining “software-equivalent” accuracy on several types of small-scale models, including multilayer perceptrons, convolutional neural networks, long short-term memory (LSTM) networks, and generative adversarial networks. The training experiments are detailed in full in a study recently published in the journal Frontiers in Neuroscience.
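
As a hedged illustration of what “mixed precision” can mean in this setting (an assumption based on the general in-memory training literature, not the Frontiers paper’s verbatim design): gradient updates are accumulated digitally in high precision, and the coarse analog weights are programmed only when the accumulated change exceeds the smallest step a device can realize.

```python
import numpy as np

# Sketch of a mixed-precision update rule under stated assumptions:
# gradients accumulate in a high-precision digital variable, and an analog
# weight is nudged only when the accumulator crosses the device's minimum
# programmable conductance step.
rng = np.random.default_rng(3)

step = 0.01              # smallest conductance change the device supports
analog_w = np.zeros(100) # low-precision weights held in analog memory
accum = np.zeros(100)    # high-precision digital accumulator

for _ in range(1000):
    grad = rng.normal(0.0, 0.001, size=100)  # stand-in for a real gradient
    accum -= grad                            # accumulate in full precision
    pulses = np.trunc(accum / step)          # whole device-steps to apply
    analog_w += pulses * step                # program the analog devices
    accum -= pulses * step                   # keep the remainder digitally
```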

IBM’s latest work in the domain follows the introduction of the company’s phase-change memory chip for AI training. While that system is still in the research stage, company researchers demonstrated it could store weight data as electrical charges, performing 100 times more calculations per square millimeter than a graphics card while drawing 1/280th the power.

“In an era transitioning more and more towards AI-based technologies, including internet-of-things battery-powered devices and autonomous vehicles, such technologies would highly benefit from fast, low-powered, and reliably accurate DNN inference engines,” IBM said in a statement. “The strategies developed in our studies show great potential towards realizing accurate AI hardware-accelerator architectures to support DNN training and inferencing in an energy-efficient manner.”

Big Data – VentureBeat
