Google researchers investigate how transfer learning works

28/08/2020

Transfer learning is an area of intense AI research: it focuses on storing the knowledge gained while solving one problem and applying it to a related problem. But despite recent breakthroughs, it's not yet well understood what enables a successful transfer or which parts of the algorithms are responsible for it.
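For readers unfamiliar with the setup, the sketch below shows the basic transfer-learning recipe the paper studies: start from weights pretrained on a source task, then fine-tune them on a new target task. It is a minimal illustration assuming PyTorch and torchvision; the backbone, class count, and hyperparameters are placeholders, not the researchers' actual configuration.

```python
# Minimal transfer-learning sketch (assumes PyTorch + torchvision; the backbone,
# target-class count, and optimizer settings are illustrative placeholders).
import torch
import torch.nn as nn
from torchvision import models

# Start from weights learned on a source task (here, ImageNet classification).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Reuse the learned features; swap only the final classifier for the target task.
num_target_classes = 5  # hypothetical target task
backbone.fc = nn.Linear(backbone.fc.in_features, num_target_classes)

# Fine-tune: every pretrained weight serves as the starting point for the new task.
optimizer = torch.optim.SGD(backbone.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()
```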

That’s why Google researchers sought to develop analysis techniques tailored to explainability challenges in transfer learning. In a new paper, they say their contributions help to solve a few of the mysteries around why machine learning models successfully — or unsuccessfully — transfer.

In the first of several experiments, the researchers sourced images from a medical imaging data set of chest X-rays (CheXpert) as well as sketches, clip art, and paintings from the open source DomainNet corpus. They partitioned each image into equal-sized blocks and shuffled the blocks randomly, disrupting the images' visual features, and then compared where models trained from pretrained weights agreed and disagreed with models trained from scratch.
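As a rough illustration of that block-shuffling step (not the researchers' actual code), the snippet below partitions an image into equal tiles and permutes them; the image shape and block size are arbitrary assumptions.

```python
# Illustrative block shuffling with NumPy; shapes and block size are assumptions.
import numpy as np

def shuffle_blocks(image: np.ndarray, block: int, rng: np.random.Generator) -> np.ndarray:
    """Partition an (H, W, C) image into block x block tiles and permute the tiles."""
    h, w, c = image.shape
    assert h % block == 0 and w % block == 0, "image must divide evenly into blocks"
    # Split into a grid of tiles: (H//block, W//block, block, block, C).
    tiles = image.reshape(h // block, block, w // block, block, c).swapaxes(1, 2)
    grid = tiles.reshape(-1, block, block, c)
    grid = grid[rng.permutation(len(grid))]  # randomize tile order, destroying visual structure
    tiles = grid.reshape(h // block, w // block, block, block, c).swapaxes(1, 2)
    return tiles.reshape(h, w, c)

rng = np.random.default_rng(0)
scrambled = shuffle_blocks(np.zeros((224, 224, 3), dtype=np.uint8), block=56, rng=rng)
```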

The researchers found the reuse of features — the individual measurable properties of a phenomenon being observed — is an important factor in successful transfers, but not the only one. Low-level statistics of the data that weren’t disturbed by things like shuffling the pixels also play a role. Moreover, any two instances of models trained from pretrained weights make similar mistakes, suggesting these models capture features in common.
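The agreement analysis mentioned above can be sketched as a simple comparison of per-example predictions from two independently trained model instances; the helper below is hypothetical and only illustrates the kind of statistics involved.

```python
# Hypothetical helper: how often two trained model instances agree, and how often
# they make the same mistake on the same example.
import numpy as np

def agreement_stats(preds_a: np.ndarray, preds_b: np.ndarray, labels: np.ndarray) -> dict:
    agree = preds_a == preds_b                     # same prediction, right or wrong
    both_wrong = (preds_a != labels) & (preds_b != labels)
    return {
        "agreement_rate": float(agree.mean()),
        "shared_mistake_rate": float((agree & both_wrong).mean()),  # identical wrong answers
    }
```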

Working from this knowledge, the researchers attempted to pinpoint where feature reuse occurs within models. They observed that features become more specialized the deeper they sit in the model (that is, the further a layer is from the input) and that feature reuse is most prevalent in the layers closest to the input. (Deep learning models contain mathematical functions arranged in layers that transmit signals from input data.) They also found it's possible to start fine-tuning pretrained models on a target task from earlier pretraining checkpoints than originally assumed without sacrificing accuracy.
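One practical reading of that layer-wise finding, sketched below, is to freeze the input-side layers, where reuse appears strongest, and fine-tune only the deeper, more specialized layers plus a new head. The layer names follow torchvision's ResNet and are an assumption for illustration, not the paper's setup.

```python
# Sketch of exploiting layer-wise feature reuse: freeze the blocks closest to the
# input and update only the deeper layers and the new classifier head.
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the early, input-side blocks (names are torchvision ResNet conventions).
for name, param in model.named_parameters():
    if name.startswith(("conv1", "bn1", "layer1", "layer2")):
        param.requires_grad = False

# Replace the head for a hypothetical 10-class target task; only unfrozen
# parameters will be updated during fine-tuning.
model.fc = nn.Linear(model.fc.in_features, 10)
trainable = [p for p in model.parameters() if p.requires_grad]
```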

“Our observation of low-level data statistics improving training speed could lead to better network initialization methods,” the researchers wrote. “Using these findings to improve transfer learning is of interest for future work.”

A better understanding of transfer learning could yield substantial algorithmic performance gains. Already, Google is using transfer learning in Google Translate so that insights gleaned through training on high-resource languages including French, German, and Spanish (which have billions of parallel examples) can be applied to the translation of low-resource languages like Yoruba, Sindhi, and Hawaiian (which have only tens of thousands of examples). Another Google team has applied transfer learning techniques to enable robot control algorithms to learn how to manipulate objects faster with less data.
