What exactly is #deep_learning?
Over the past few days, some prominent AI researchers, such as #Yann_LeCun and #Francois_Chollet, shared their own definitions of deep learning as of 2019.
Yann LeCun:
"Some folks still seem confused about what deep learning is. Here is a definition:
DL is constructing networks of parameterized functional modules & training them from examples using gradient-based optimization. That's it.
This definition is orthogonal to the learning paradigm: reinforcement, supervised, or self-supervised.
Don't say "DL can't do X" when what you really mean is "supervised learning needs too much data to do X"
Extensions (dynamic networks, differentiable programming, graph NN, etc) allow the network architecture to change dynamically in a data-dependent way."
https://www.facebook.com/722677142/posts/10156463919392143/
François Chollet:
"What's deep learning?
The "common usage" definition as of 2019 would be "chains of differentiable parametric layers trained end-to-end with backprop".
But this definition seems overly restrictive to me. It describes *how we do DL today*, not *what it is*."
https://twitter.com/fchollet/status/1210031900695449600
Andriy Burkov:
"Looks like in late 2019, people still need a definition of deep learning, so here's mine: deep learning is finding parameters of a nested parametrized non-linear function by minimizing an example-based differentiable cost function using gradient descent."
https://www.linkedin.com/posts/andriyburkov_looks-like-in-late-2019-people-still-need-activity-6615377527147941888-ce68/
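All three definitions converge on the same recipe: a nested, differentiable parametric function whose parameters are found by gradient descent on an example-based cost. Below is a minimal NumPy sketch of that recipe; the toy task (fitting sin(x)), the one-hidden-layer tanh architecture, and the hyperparameters are illustrative assumptions, not taken from the quoted posts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy examples: learn y = sin(x) from data
X = rng.uniform(-3, 3, size=(256, 1))
y = np.sin(X)

# Nested parametrized non-linear function:
#   f(x) = W2 @ tanh(W1 @ x + b1) + b2
W1 = rng.normal(0, 0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(2000):
    # Forward pass through the nested function
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    loss = np.mean(err ** 2)      # example-based differentiable cost
    if step == 0:
        loss0 = loss              # remember the starting loss

    # Backward pass: backprop is the chain rule through the nesting
    g_pred = 2 * err / len(X)
    g_W2 = h.T @ g_pred; g_b2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    g_W1 = X.T @ g_h; g_b1 = g_h.sum(0)

    # Gradient-descent update of all parameters
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

print(f"loss: {loss0:.4f} -> {loss:.4f}")
```

In LeCun's phrasing, the two weight matrices are the "parameterized functional modules"; in Chollet's, the tanh layer and the linear output layer are the "chain of differentiable parametric layers"; in Burkov's, the whole composition is the "nested parametrized non-linear function."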
#deep_learning