Gods and Robots

In this episode of the podcast we shake things up! Neil is on the guest side of the table with his partner, Rabbi Laura Janner-Klausner, to discuss their upcoming project, Gods and Robots. Katherine is joined on the host side by friend of the show Professor Michael Littman.

arXiv Whitepapers

Holistic Evaluation of Language Models
Language models (LMs) are becoming the foundation for almost all major language technologies, but their capabilities, limitations, and risks are not well understood. We present Holistic Evaluation of Language Models (HELM) to improve the transparency of language models. First, we taxonomize the vast...

Melody transcription via generative pre-training
Despite the central role that melody plays in music perception, it remains an open challenge in music information retrieval to reliably detect the notes of the melody present in an arbitrary music recording. A key challenge in melody transcription is building methods which can handle broad audio...

AfroLM: A Self-Active Learning-based Multilingual Pretrained Language Model for 23 African Languages
In recent years, multilingual pre-trained language models have gained prominence due to their remarkable performance on numerous downstream Natural Language Processing (NLP) tasks. However, pre-training these large multilingual language models requires a lot of training data, which is not available...

News Articles

Faster big-data analysis
Bug-repair system learns from example
New leadership for MIT-IBM Watson AI Lab
Artificial intelligence suggests recipes based on food photos
Lincoln Laboratory enters licensing agreement to produce its localizing ground-penetrating radar
Bringing neural networks to cellphones
Miniaturizing the brain of a drone