Aug 21, 2020
In COVID-related AI news, Andy and Dave discuss a survey from Amazon Web Services that examines the current status of Internet of Things applications related to COVID-19, including scenarios that might help reduce the severity of an outbreak. MIT publishes a combinatorial machine learning method to maximize the coverage of a COVID-19 vaccine. In “quick takes” on research, Andy and Dave discuss research from Microsoft, the University of Washington, and UC Irvine, which provides a checklist to help identify bugs in natural language processing algorithms. A paper from Element AI and Stanford examines whether benchmarks for natural language systems actually correspond to how we use those systems. The University of Illinois at Urbana-Champaign, Columbia University, and the US Army Research Lab introduce GAIA, which processes unstructured and heterogeneous multimedia data, creates a coherent knowledge base, and allows for text queries. Research published in Nature Neuroscience examines the brain connectivity of 130 mammalian species and finds that the efficiency of information transfer through the brain does not depend on the size or structure of any specific brain. And finally, Andy and Dave spend some time talking about the broader implications of GPT-3, the experiments that people are conducting with it, and how it is not an AGI. Dave concludes with an analogy from Star Trek: The Next Generation, which he gets mostly correct, though he misattributes Geordi La Forge’s action to Dr. Pulaski. If only he had a positronic matrix!