Print List Price: CDN$ 91.95
Kindle Price: CDN$ 87.35 (Save CDN$ 4.60, 5%)

![The Principles of Deep Learning Theory: An Effective Theory Approach to Understanding Neural Networks by [Daniel A. Roberts, Sho Yaida, Boris Hanin]](https://m.media-amazon.com/images/W/IMAGERENDERING_521856-T1/images/I/51gxF5cOdAL._SY346_.jpg)
The Principles of Deep Learning Theory: An Effective Theory Approach to Understanding Neural Networks [Print Replica] Kindle Edition
| Format | Amazon Price |
| --- | --- |
| Kindle Edition | CDN$ 87.35 |
| Hardcover | CDN$ 91.95 |
Product description
Review
'For a physicist, it is very interesting to see deep learning approached from the point of view of statistical physics. This book provides a fascinating perspective on a topic of increasing importance in the modern world.' Edward Witten, Institute for Advanced Study
'This is an important book that contributes big, unexpected new ideas for unraveling the mystery of deep learning’s effectiveness, in unusually clear prose. I hope it will be read and debated by experts in all the relevant disciplines.' Scott Aaronson, University of Texas at Austin
'It is not an exaggeration to say that the world is being revolutionized by deep learning methods for AI. But why do these deep networks work? This book offers an approach to this problem through the sophisticated tools of statistical physics and the renormalization group. The authors provide an elegant guided tour of these methods, interesting for experts and non-experts alike. They write with clarity and even moments of humor. Their results, many presented here for the first time, are the first steps in what promises to be a rich research program, combining theoretical depth with practical consequences.' William Bialek, Princeton University
'This book’s physics-trained authors have made a cool discovery, that feature learning depends critically on the ratio of depth to width in the neural net.' Gilbert Strang, Massachusetts Institute of Technology
About the Author
Sho Yaida is a research scientist at Meta AI. Prior to joining Meta AI, he obtained his PhD in physics at Stanford University and held postdoctoral positions at MIT and at Duke University. At Meta AI, he uses tools from theoretical physics to understand neural networks, the topic of this book.
Boris Hanin is an Assistant Professor at Princeton University in the Operations Research and Financial Engineering Department. Prior to joining Princeton in 2020, Boris was an Assistant Professor at Texas A&M in the Math Department and an NSF postdoc at MIT. He has taught graduate courses on the theory and practice of deep learning at both Texas A&M and Princeton.
Product details
- ASIN : B09YM1R6XW
- Publisher : Cambridge University Press (May 26 2022)
- Language : English
- File size : 13154 KB
- Simultaneous device usage : Up to 4 simultaneous devices, per publisher limits
- Text-to-Speech : Not enabled
- Enhanced typesetting : Not enabled
- X-Ray : Not enabled
- Word Wise : Not enabled
- Sticky notes : Not enabled
- Best Sellers Rank: #694,390 in Kindle Store
- #78 in Mathematical Physics (Kindle Store)
- #78 in Mathematical Physics eBooks
- #430 in Mathematical Physics Books
About the authors
Dan Roberts (https://danintheory.com) is currently a Research Affiliate at the Center for Theoretical Physics at MIT, an Affiliate of the NSF AI Institute for Artificial Intelligence and Fundamental Interactions, and a Principal Researcher at Salesforce. Previously, he was Co-Founder and CTO of Diffeo, a collaborative AI company acquired by Salesforce, a research scientist at Facebook AI Research (FAIR) in NYC, and a Member of the School of Natural Sciences at the Institute for Advanced Study in Princeton, NJ. Dan received a Ph.D. from MIT, funded by a Hertz Foundation Fellowship and the NDSEG, and he studied at Cambridge and Oxford as a Marshall Scholar. Dan's research has centered on the interplay between physics and computation, and previously he has focused on the relationship between black holes, quantum chaos, computational complexity, randomness, and how the laws of physics are related to fundamental limits of computation.
Sho Yaida (https://shoyaida.com) is currently a Research Scientist at Meta AI. Prior to joining Meta AI, he obtained his Ph.D. in physics at Stanford University -- where he studied the physics of black holes and strongly correlated systems -- and held postdoctoral positions at MIT and at Duke University -- where he studied the physics of glasses. At Meta AI, he studies the physics of machine learning.
