MIT CSAIL aims for energy efficiency in AI model training

April 24, 2020   Big Data

In a newly published paper, MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) researchers propose a system for training and running AI models in a way that’s more environmentally friendly than previous approaches. They claim it can cut the carbon emissions involved to “low triple digits” of pounds in some cases, mainly by improving the models’ computational efficiency.

Impressive feats have been achieved with AI across domains like image synthesis, protein modeling, and autonomous driving, but the technology’s sustainability issues remain largely unresolved. Last June, researchers at the University of Massachusetts Amherst released a report estimating that training and architecture search for a single large model can emit roughly 626,000 pounds of carbon dioxide, equivalent to nearly five times the lifetime emissions of the average U.S. car.

The researchers’ solution, a “once-for-all” network, trains one large model comprising many pretrained sub-models of different sizes that can be tailored to a range of platforms without retraining. Each sub-model can operate independently at inference time, and the system identifies the best one based on the accuracy and latency trade-offs that map to the target hardware’s power and speed limits. (For instance, for smartphones, the system will select larger subnetworks, but with different structures depending on individual battery lifetimes and computation resources.)
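
To make the selection step concrete, here is a minimal Python sketch of picking a sub-network under a latency budget. Everything in it (the SubNet fields, predict_accuracy, estimate_latency, and the choice sets) is an illustrative assumption, not the published once-for-all code or API; the point is simply that the most accurate sub-network that fits the device’s constraints wins.

```python
# A minimal sketch of hardware-aware sub-network selection, assuming a
# hypothetical accuracy predictor and latency model; none of these names
# come from the published once-for-all code.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class SubNet:
    depth: int   # layers active per unit
    width: int   # channel expansion ratio
    kernel: int  # convolution kernel size

def predict_accuracy(net: SubNet) -> float:
    # Stand-in for a trained accuracy predictor: bigger scores higher.
    return 0.6 + 0.02 * net.depth + 0.01 * net.width + 0.005 * net.kernel

def estimate_latency(net: SubNet, ms_per_unit: float) -> float:
    # Stand-in for a per-device latency table: cost grows with size.
    return ms_per_unit * net.depth * net.width * net.kernel ** 2 / 9.0

def select_subnet(budget_ms: float, ms_per_unit: float) -> SubNet:
    """Return the most accurate sub-network that fits the latency budget."""
    candidates = [SubNet(d, w, k)
                  for d, w, k in product((2, 3, 4), (3, 4, 6), (3, 5, 7))]
    feasible = [n for n in candidates
                if estimate_latency(n, ms_per_unit) <= budget_ms]
    return max(feasible, key=predict_accuracy)

# The slower device ends up with a smaller sub-network for the same budget.
print(select_subnet(budget_ms=30, ms_per_unit=3.0))  # slow phone: small net
print(select_subnet(budget_ms=30, ms_per_unit=0.1))  # fast GPU: largest net
```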

A “progressive shrinking” algorithm efficiently trains the large model to support all of the sub-models simultaneously. The large model is trained first; the smaller sub-models are then trained with the large model’s guidance so that they learn together. In the end, all of the sub-models are supported, allowing speedy specialization based on the target platform’s specifications.
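
The shape of that training schedule can be sketched as follows. The helpers are stubs and the sampling counts arbitrary, so this shows only the control flow (largest network first, full model as teacher, one elastic dimension at a time), not the paper’s actual implementation.

```python
# A schematic sketch of a progressive-shrinking training schedule,
# with stub training helpers standing in for real optimization steps.
import random

def train_full_model(model):
    print(f"train largest configuration of {model} to convergence")

def distill_step(model, subnet_cfg):
    # Fine-tune the shared weights for one sampled sub-network, using the
    # full model's outputs as a soft target (knowledge distillation).
    print(f"distill into sub-net {subnet_cfg}")

def progressive_shrinking(model, stages):
    train_full_model(model)            # step 1: the biggest network first
    enabled = {}                       # choice space grows stage by stage
    for dim, choices in stages:        # step 2: relax one dimension at a time
        enabled[dim] = choices
        for _ in range(3):             # sample a few sub-networks per stage
            cfg = {d: random.choice(c) for d, c in enabled.items()}
            distill_step(model, cfg)

progressive_shrinking(
    model="once-for-all-net",
    stages=[("kernel", (7, 5, 3)),     # elastic kernel size
            ("depth",  (4, 3, 2)),     # then elastic depth
            ("width",  (6, 4, 3))],    # then elastic width
)
```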

In experiments, the researchers found that training a computer vision model containing over 10 quintillion architectural settings with their approach was far more efficient than spending hours training each sub-network individually. Moreover, it didn’t compromise the model’s accuracy or efficiency: the model achieved state-of-the-art accuracy on mobile devices when tested against a common benchmark (ImageNet) and ran 1.5 to 2.6 times faster at inference than leading classification systems.
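
For a sense of where a count like “10 quintillion” can come from, here is back-of-the-envelope arithmetic over per-layer and per-unit choices. The choice sets below are assumptions for illustration, not the paper’s exact configuration.

```python
# Back-of-the-envelope arithmetic for a ~10-quintillion search space,
# using assumed (illustrative) choice sets for each elastic dimension.
kernel_sizes = 3          # e.g. kernels in {3, 5, 7}
widths       = 3          # e.g. expansion ratios in {3, 4, 6}
depths       = (2, 3, 4)  # layers active per unit
units        = 5          # sequential units in the network

per_layer = kernel_sizes * widths                # 9 settings per layer
per_unit  = sum(per_layer ** d for d in depths)  # 9**2 + 9**3 + 9**4 = 7371
total     = per_unit ** units                    # 7371**5 ≈ 2.2e19

print(f"{total:.1e} distinct sub-networks")      # ~2.2e19, >10 quintillion
```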

Perhaps more impressive, the researchers claim the computer vision model required roughly 1/1,300 the carbon emissions of today’s popular model search techniques during training. “If rapid progress in AI is to continue, we need to reduce its environmental impact,” said IBM fellow and member of the MIT-IBM Watson AI Lab John Cohn, referring to the study. “The upside of developing methods to make AI models smaller and more efficient is that the models may also perform better.”

Both the code and pretrained models for devices like the Samsung Galaxy Note8 and Note10, Samsung Galaxy S7 Edge, LG G8, Google Pixel and Pixel 2; processors like Intel Xeon; and graphics cards like Nvidia’s GTX 1080Ti, Jetson TX2, and V100 are available on GitHub.

It’s worth noting that MIT CSAIL’s work builds on approaches like the one outlined in a 2017 paper, “Efficient Processing of Deep Neural Networks: A Tutorial and Survey,” which laid out ways to reduce the computational demands of machine learning models, including changes to hardware design, co-design of hardware and algorithms, and changes to the algorithms themselves. Other proposals have called for an industry-level energy analysis for computer scientists and a compute-per-watt standard for machine learning projects.

Big Data – VentureBeat

Tags: Training, Aims, CSAIL, Efficiency, energy, model