Abstract

This paper introduces Parafinitary Learning (PL), a learning framework that asserts the primacy of scale in the construction of artificially intelligent agents. The framework is applied as an extension of neural networks and produces competitive results on simple binary classification tasks. The base neural network implementation of PL employs a special case of Oja’s Rule (called Oja’s Golden Rule). This special case is then augmented with two additional mechanisms for improving efficiency and stability: radix economies, a concept drawn from coding theory, and prediction markets, which cast the models under development as logical inductors over the target distribution being learned.
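For context, standard Oja’s Rule is a Hebbian update with a built-in normalization term that keeps the weight vector bounded while it converges to the data’s first principal component. The sketch below shows the standard rule only; the abstract does not specify what distinguishes the paper’s “Oja’s Golden Rule” special case, so the function name and toy data here are illustrative assumptions, not the paper’s implementation.

```python
import numpy as np

def oja_update(w, x, lr=0.01):
    """One step of standard Oja's Rule: dw = lr * y * (x - y * w),
    where y = w . x. The -y^2 * w term bounds the weight norm,
    so w converges to the first principal component direction."""
    y = w @ x
    return w + lr * y * (x - y * w)

# Toy usage (illustrative, not from the paper): learn the dominant
# direction of correlated 2-D data drawn one sample at a time.
rng = np.random.default_rng(0)
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(5000):
    x = rng.normal(size=2)
    x[1] = 0.9 * x[0] + 0.1 * x[1]   # correlate the two components
    w = oja_update(w, x)
# After training, w is approximately unit-norm and aligned (up to
# sign) with the top eigenvector of the data covariance.
```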
