Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

September 11, 2011

Efficient P2P Ensemble Learning with Linear Models on Fully Distributed Data

Filed under: Ensemble Methods,Machine Learning,P2P — Patrick Durusau @ 7:02 pm

Efficient P2P Ensemble Learning with Linear Models on Fully Distributed Data by Róbert Ormándi, István Hegedűs, and Márk Jelasity.

Abstract:

Machine learning over fully distributed data poses an important problem in peer-to-peer (P2P) applications. In this model we have one data record at each network node, but without the possibility to move raw data due to privacy considerations. For example, user profiles, ratings, history, or sensor readings can represent this case. This problem is difficult, because there is no possibility to learn local models, yet the communication cost needs to be kept low. Here we propose gossip learning, a generic approach that is based on multiple models taking random walks over the network in parallel, while applying an online learning algorithm to improve themselves, and getting combined via ensemble learning methods. We present an instantiation of this approach for the case of classification with linear models. Our main contribution is an ensemble learning method which, through the continuous combination of the models in the network, implements a virtual weighted voting mechanism over an exponential number of models at practically no extra cost as compared to independent random walks. Our experimental analysis demonstrates the performance and robustness of the proposed approach.
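The gossip learning idea is easier to picture with a small example. Below is a minimal Python sketch, assuming one labeled record per node: each round, models are copied to random peers, merged by age-weighted averaging, and given one online update on the peer's local record. The names (LinearModel, gossip_round), the hinge-loss update rule, and the merge rule are illustrative assumptions, not the authors' exact algorithm.

# A minimal sketch of gossip learning, assuming one labeled record per node.
# The update and merge rules here are illustrative, not the paper's exact method.
import copy
import random
import numpy as np

class LinearModel:
    """A linear classifier that improves itself via online updates."""
    def __init__(self, dim):
        self.w = np.zeros(dim)
        self.age = 0  # number of examples seen; used as a merge weight

    def update(self, x, y):
        """One online update on a single labeled record (y in {-1, +1})."""
        self.age += 1
        eta = 1.0 / self.age              # decaying learning rate
        if y * np.dot(self.w, x) < 1:     # hinge-loss (SVM-style) condition
            self.w += eta * y * x

    def merge(self, other):
        """Combine an incoming model with the local one. Age-weighted averaging
        stands in for the paper's virtual weighted voting over many models."""
        total = self.age + other.age
        if total == 0:
            return
        self.w = (self.age * self.w + other.age * other.w) / total
        self.age = max(self.age, other.age)

def gossip_round(nodes, data, labels):
    """One gossip round: every node sends a copy of its model to a random peer;
    each peer merges what it received, then trains on its own local record."""
    inbox = {i: [] for i in range(len(nodes))}
    for model in nodes:
        peer = random.randrange(len(nodes))
        inbox[peer].append(copy.deepcopy(model))  # raw data never leaves a node
    for i, received in inbox.items():
        for m in received:
            nodes[i].merge(m)
        nodes[i].update(data[i], labels[i])

# Toy usage: 20 nodes, each holding one 2-D record with a linearly separable label.
rng = np.random.default_rng(0)
data = rng.normal(size=(20, 2))
labels = np.where(data[:, 0] + data[:, 1] > 0, 1, -1)
nodes = [LinearModel(dim=2) for _ in range(20)]
for _ in range(50):
    gossip_round(nodes, data, labels)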

Interesting. In a topic map context, I wonder about creating associations based on information that is never revealed to the peer making the association, or to the peer suggesting it?
