Deep Learning: Zero to One podcast

Art Generation - Facebook AI Research, Google DeepDream, and Ruder's Style Transfer for Video - Deep Learning: Zero to One

Justin Johnson, now at Facebook, wrote the original Torch implementation of the Gatys 2015 paper, which combines the content of one image with the style of another using convolutional neural networks. Manuel Ruder's 2016 follow-up paper transfers the style of one image to a whole video sequence, using a computer vision technique called optical flow to produce consistent, stable stylized video. I used Ruder's implementation to generate a stylized butterfly video, available at https://medium.com/@SamPutnam/deep-learning-zero-to-one-art-generation-b532dd0aa390
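The Gatys approach optimizes a generated image so that its deep CNN activations match the content image while the Gram matrices (feature correlations) of its activations match the style image. Here is a minimal NumPy sketch of the two loss terms; the feature maps, weights, and shapes are illustrative placeholders, not the actual VGG activations or hyperparameters from the paper:

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height*width) activation map from one CNN layer.
    # The Gram matrix captures correlations between feature channels,
    # which is what the style loss compares.
    return features @ features.T

def style_transfer_loss(gen_feats, content_feats, style_feats,
                        alpha=1.0, beta=1e-3):
    # Content loss: squared distance between generated and content activations.
    content_loss = np.sum((gen_feats - content_feats) ** 2)
    # Style loss: squared distance between Gram matrices, normalized
    # by channel count c and spatial size n as in Gatys et al.
    c, n = gen_feats.shape
    g_gen, g_style = gram_matrix(gen_feats), gram_matrix(style_feats)
    style_loss = np.sum((g_gen - g_style) ** 2) / (4 * c**2 * n**2)
    # alpha and beta trade off content fidelity against style strength.
    return alpha * content_loss + beta * style_loss

# Toy example with random "activations" standing in for CNN features.
rng = np.random.default_rng(0)
f = rng.standard_normal((8, 16))
print(style_transfer_loss(f, f, f))  # identical inputs -> 0.0
```

In the full method, this loss is summed over several CNN layers and minimized by gradient descent on the pixels of the generated image; Ruder's video extension adds a temporal consistency term that uses optical flow to warp the previous stylized frame onto the current one.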
