Gods and Robots
In this episode of the podcast we shake things up! Neil is on the guest side of the table with his partner Rabbi Laura Janner-Klausner to discuss their upcoming project, Gods and Robots. Katherine is joined on the host side by friend of the show Professor Michael Littman. See...

arXiv Whitepapers

Dataset Distillation: A Comprehensive Review
Recent success of deep learning can be largely attributed to the huge amount of data used for training deep neural networks. However, the sheer amount of data significantly increases the burden on storage and transmission. It would also consume considerable time and computational resources to train...

Data Distillation: A Survey
The popularity of deep learning has led to the curation of a vast number of massive and multifarious datasets. Despite having close-to-human performance on individual tasks, training parameter-hungry models on large datasets poses multi-faceted problems such as (a) high model-training time; (b) slow...

TRUE: Re-evaluating Factual Consistency Evaluation
Grounded text generation systems often generate text that contains factual inconsistencies, hindering their real-world applicability. Automatic factual consistency evaluation may help alleviate this limitation by accelerating evaluation cycles, filtering inconsistent outputs and augmenting training...

News Articles
Lincoln Laboratory enters licensing agreement to produce its localizing ground-penetrating radar
Bringing neural networks to cellphones
Miniaturizing the brain of a drone