The Lottery Ticket Hypothesis
Content provided by Linear Digressions, Ben Jaffe, and Katie Malone. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Linear Digressions, Ben Jaffe, and Katie Malone or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://fr.player.fm/legal.
Recent research into neural networks reveals that, sometimes, not all parts of the neural net are equally responsible for the network’s overall performance. Instead, it seems that (in some neural nets, at least) there are smaller subnetworks where most of the predictive power resides. The fascinating thing is that, for some of these subnetworks (so-called “winning lottery tickets”), it’s not the training process that makes them good at their classification or regression tasks: they just happened to be initialized in a way that was very effective. This changes the way we think about what training might be doing, in a pretty fundamental way. Sometimes, instead of crafting a good fit from whole cloth, training might be finding the parts of the network that had predictive power to begin with, and isolating and strengthening them. This research is recent, having only come to prominence in the last year, but it nonetheless challenges our notions of what it means to train a machine learning model.
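The hypothesis grew out of a simple experiment, roughly: train a dense network, prune its smallest-magnitude weights, rewind the surviving weights to their original initialization, and retrain just that sparse subnetwork. Below is a minimal PyTorch sketch of that iterative prune-and-rewind loop, to make the idea concrete; the `train_fn` callback, the per-round pruning rate, and the mask bookkeeping are illustrative assumptions, not code from the episode or the original paper.

```python
# Minimal sketch of the "lottery ticket" experiment: iterative magnitude
# pruning with weight rewinding. The model and the train_fn callback are
# hypothetical placeholders supplied by the caller.
import copy
import torch


def prune_by_magnitude(model, masks, prune_fraction=0.2):
    """Prune the smallest-magnitude surviving weights in each masked layer.

    prune_fraction is the share of currently surviving weights to remove
    this round (the 20% here is an illustrative choice, not a fixed rule).
    """
    new_masks = {}
    for name, param in model.named_parameters():
        if name not in masks:
            continue
        mask = masks[name]
        surviving = param[mask.bool()].abs()       # magnitudes of unpruned weights
        k = int(prune_fraction * surviving.numel())
        if k == 0:
            new_masks[name] = mask
            continue
        threshold = surviving.sort().values[k - 1]  # k-th smallest surviving magnitude
        new_masks[name] = mask * (param.abs() > threshold).float()
    return new_masks


def find_lottery_ticket(model, train_fn, rounds=5):
    """Iteratively train, prune, and rewind to the original initialization.

    train_fn(model, masks) is assumed to train the model while keeping
    pruned (masked-out) weights frozen at zero.
    """
    init_state = copy.deepcopy(model.state_dict())  # theta_0: the candidate "ticket"
    masks = {n: torch.ones_like(p)
             for n, p in model.named_parameters() if "weight" in n}

    for _ in range(rounds):
        train_fn(model, masks)                      # train the current sparse network
        masks = prune_by_magnitude(model, masks)    # drop the smallest surviving weights
        model.load_state_dict(init_state)           # rewind survivors to their initial values
        with torch.no_grad():
            for name, param in model.named_parameters():
                if name in masks:
                    param.mul_(masks[name])         # re-zero the pruned weights

    # One final train_fn call on the returned (masked, rewound) model
    # evaluates whether the ticket matches the dense network's accuracy.
    return model, masks
```

The rewinding step is the crux: in the original experiments, a subnetwork of the same shape but with randomly reinitialized weights trained noticeably worse, which is what suggests the initial weights themselves carried much of the subnetwork’s value.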