Mar 19, 2021
Andy and Dave discuss the latest in AI news, including an
announcement from Facebook AI that it achieved state of the art
computer vision performance with its SEER model, by learning from
one billion (with a ‘b’) random, unlabeled, and uncurated public
Instagram images, reaching 84% top-1 accuracy on 13k images from
ImageNet. DARPA launches a new Perceptually-enabled Task Guidance
(PTG) program to help humans perform complex tasks (such as through
augmented reality); the effort will include both fundamental
research and integrated demonstrations. DARPA also announces the
research teams for its Semantic Forensics (SemaFor) program, aimed
at probing media manipulations. Chris Ume, a Belgian visual effects
artist, releases four deepfake videos of Tom Cruise, using two
NVIDIA GPUs, two months training time, and further days of
processing and tweaking for each clip. Researchers at the
University of Washington, Berkeley, and Google Research use the
StyleGAN2 framework to create “time-travel photography,” which
peels away the limitations of early cameras to reveal restored
images of the original photos; the effort also involves the
creation of a modern “sibling,” which then gets merged with the
original. OpenAI publishes the discovery that neurons in its CLIP
network respond to the same concept, whether literal, symbolic
(e.g., a sketch), or conceptual (e.g., text); they also discover
an absurdly simple attack, which involves placing a sticker with a
word onto an item. The report of the week from UNICEF looks at
Adolescent Perspectives on AI, with insights from 245 adolescents
from five countries. Montreal.AI provides a 33-page “cheat sheet”
with condensed information and links on AI topics. The book of the
week from E-IR examines Remote Warfare: Interdisciplinary
Perspectives. And the fun site of the week, MyHeritage, lets users
animate photos, or “re-animate your dead loved ones.”
Listeners Survey: https://bit.ly/3bqyiHk
Click here to visit our website and explore the links mentioned in the episode.