«The most natural playground for exploring functional structures trained as deep learning networks would be a new language that can run back-propagation directly on functional programs. As it turns out, hidden in the details of implementation, functional programs are actually compiled into a computational graph similar to what back-propagation requires.» http://edge.org/response-detail/26794 #fp #link to #deeplearning
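The claim is easy to see in a toy reverse-mode autodiff: this is a minimal sketch (mine, not from the linked essay) where composing pure functions like `add` and `mul` implicitly records exactly the computational graph that back-propagation then walks in reverse. All names here (`Var`, `backprop`) are made up for illustration.

```python
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent Var, local gradient) pairs
        self.grad = 0.0

def add(a, b):
    # d(a+b)/da = 1, d(a+b)/db = 1
    return Var(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    # d(a*b)/da = b, d(a*b)/db = a
    return Var(a.value * b.value, [(a, b.value), (b, a.value)])

def backprop(out):
    # Reverse topological walk over the graph the function calls recorded.
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            order.append(v)
    visit(out)
    out.grad = 1.0
    for v in reversed(order):
        for p, local in v.parents:
            p.grad += v.grad * local  # chain rule

# f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x
x, y = Var(3.0), Var(4.0)
z = add(mul(x, y), x)
backprop(z)
print(z.value, x.grad, y.grad)  # → 15.0 5.0 3.0
```

Note that the "program" is just ordinary function composition; the graph falls out of evaluating it, which is the essay's point about functional programs already being what backprop needs.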
No likes; this must have slipped under the radar. To me, it feels as if I'd found a secret door in a well-known room. ‎· 9000