Tomaso Poggio, Hrushikesh Mhaskar, Lorenzo Rosasco, Brando Miranda and Qianli Liao. Why and When Can Deep-but Not Shallow-networks Avoid the Curse of Dimensionality: A Review. International Journal of Automation and Computing, vol. 14, no. 5, pp. 503-519, 2017. DOI: 10.1007/s11633-017-1054-2

Why and When Can Deep-but Not Shallow-networks Avoid the Curse of Dimensionality: A Review

  • The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures.
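The central object behind the abstract's claim is a hierarchically local compositional function: a high-dimensional function built as a tree of low-dimensional constituents, which a deep network can match layer by layer while a shallow network must approximate in the full input dimension. A minimal sketch of such a function, using an illustrative 2-variable constituent and a depth-3 binary tree over 8 inputs (these specific choices are not from the paper):

```python
# Illustrative hierarchically compositional function over 8 inputs.
# h is an arbitrary 2-variable "constituent" function; the tree shape
# (pairs combined at each level) is what a matching deep network exploits.

def h(a, b):
    # Local constituent of only 2 variables (an illustrative choice).
    return a * b + a

def compositional_f(x):
    # f(x1..x8) as a binary tree of 2-variable constituents. A deep
    # network mirroring this tree approximates f with complexity that
    # scales with the 2-dimensional pieces, not the 8-dimensional whole.
    assert len(x) == 8
    level1 = [h(x[0], x[1]), h(x[2], x[3]), h(x[4], x[5]), h(x[6], x[7])]
    level2 = [h(level1[0], level1[1]), h(level1[2], level1[3])]
    return h(level2[0], level2[1])

print(compositional_f([1] * 8))  # → 42
```

The point of the sketch is structural: each node depends on only two inputs, so the effective dimensionality seen by any one layer stays constant as the input dimension grows.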
