Similar Tracks
Sparsity in Deep Learning: Pruning + growth for efficient inference and training in neural networks (Scalable Parallel Computing Lab, SPCL @ ETH Zurich)
Stéphane Mallat: "Deep Generative Networks as Inverse Problems" (Institute for Pure & Applied Mathematics (IPAM))
AI Snake Oil: What Artificial Intelligence Can Do, What It Can’t, and How to Tell the Difference (MIT Shaping the Future of Work Initiative)
Yann LeCun: “AI Breakthroughs & Obstacles to Progress, Mathematical and Otherwise” (Institute for Pure & Applied Mathematics (IPAM))
Tips Tricks 20 - Understanding transfer learning for different size and channel inputs (DigitalSreeni)
Joel Tropp - Scalable semidefinite programming - IPAM at UCLA (Institute for Pure & Applied Mathematics (IPAM))
Xavier Bresson: "Convolutional Neural Networks on Graphs" (Institute for Pure & Applied Mathematics (IPAM))