Abstract
|
<p>This Letter presents an approach to using both labelled and unlabelled data to train a multilayer perceptron. The unlabelled data are iteratively pre-processed by the perceptron being trained, in order to obtain soft class label estimates. It is demonstrated that substantial gains in classification performance may be achieved with this approach when the labelled data do not adequately represent the entire class distributions. The experimental investigations performed show that the proposed approach may be successfully used to train neural networks for various classification problems.</p>
|
Author
|
Antanas Verikas +
, Adas Gelzinis +
, Kerstin Malmqvist +
|
DOI
|
http://dx.doi.org/10.1023/A:1012707515770 +
|
Diva
|
http://hh.diva-portal.org/smash/record.jsf?searchId=1&pid=diva2:286834
|
EndPage
|
201 +
|
Issue
|
3 +
|
Journal
|
Neural Processing Letters +
|
PublicationType
|
Journal Paper +
|
Publisher
|
Springer +
|
StartPage
|
179 +
|
Title
|
Using unlabelled data to train a multilayer perceptron +
|
Volume
|
14 +
|
Year
|
2001 +
|
Categories
|
Publication +
|
Modification date
|
30 September 2016 20:42:05 +
|