In order to discuss the genuine roots of the connectionist paradigm, in this paper we examine the asymmetry features of the random phenomena involved. Reading these features in terms of the intentionality with which we drive a learning process away from a simple random walk, we focus on elementary processes whose trajectories cannot be decomposed into the sum of a deterministic recursive function plus symmetric noise. Rather, we look at nonlinear compositions of these ingredients as a source of genuinely asymmetric atomic random actions, like those at the basis of a training process. To this aim we introduce an extended Pareto distribution law, with which we analyze some intentional trajectories. With this model we offer some preliminary considerations on the elapsed times of training sessions for some families of neural networks.
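The central observation above — that composing a nonlinear map with symmetric noise yields asymmetric atomic random actions — can be illustrated with a minimal numerical sketch. The choice of $f(x) = e^x - 1$ as the nonlinear composition is our own illustrative assumption, not the paper's model: any nonlinear $f$ applied to symmetric noise will generally break the symmetry of the resulting increments.

```python
import math
import random
import statistics

random.seed(42)

# Symmetric atomic noise: eps ~ N(0, 1); skewness is 0 by construction.
eps = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Nonlinear composition: pass the symmetric noise through f(x) = exp(x) - 1
# (an illustrative choice). The resulting "atomic random actions" are no
# longer symmetric: the distribution acquires a positive skew.
actions = [math.exp(e) - 1.0 for e in eps]

def skewness(xs):
    """Sample skewness: mean of standardized cubes."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    return statistics.fmean([((x - m) / s) ** 3 for x in xs])

print(f"skew(eps)     = {skewness(eps):+.3f}")      # close to 0
print(f"skew(actions) = {skewness(actions):+.3f}")  # clearly positive
```

A trajectory built by accumulating such asymmetric actions can no longer be written as a deterministic recursion plus symmetric noise, which is exactly the class of processes the paper singles out; heavy-tailed laws such as the Pareto family then become natural candidates for modeling quantities like training-session elapsed times.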