The ability to predict data sequences is important in data transmission to provide error correction. Certain algorithms can predict repetitive code with good accuracy, but fail in the presence of noisy code sequences.
Mr. James Johnson of Netrologic, Inc. (Dayton, OH) trained a BrainMaker neural network on noisy data and was able to predict code sequences with an accuracy of 62% to 93%, depending on the initial conditions and the presence or absence of noise. Higher accuracy could probably be obtained by training the network with a wider variety of training samples.
The network was given an input of 100 bits generated using this algorithm:
b(a) = b(a-3) XOR b(a-31), where 32 <= a <= 100.
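The recurrence is straightforward to reproduce. Below is a minimal sketch in Python, assuming the first 31 bits of the sequence are the 31-bit random seed and that the indexing is 1-based as in the formula; the function and variable names are illustrative, not part of the original work.

    import random

    def generate_sequence(seed_bits, length=100):
        # Extend a 31-bit seed to `length` bits via b(a) = b(a-3) XOR b(a-31).
        bits = list(seed_bits)              # bits[0] holds b(1), ..., bits[30] holds b(31)
        for a in range(32, length + 1):     # 32 <= a <= length
            bits.append(bits[(a - 3) - 1] ^ bits[(a - 31) - 1])
        return bits

    seed = [random.randint(0, 1) for _ in range(31)]   # the 31-bit random seed
    sequence = generate_sequence(seed)                 # 100 input bits for one fact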
The network was asked to predict the 101st bit of the sequence with no explicit knowledge of how the string was formed. The equation used to generate the bits was started from a 31-bit random seed. A set of 1,000 training facts was generated to train a back-propagation network. The training data was built in correlated groups: five sets of 100 bits were generated using the algorithm above and the same 31-bit seed, shifted right one additional position for each subsequent set. Then a new random 31-bit seed was generated and five more correlated 100-bit sets were produced, and so on.
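The correlated grouping could be assembled as sketched below, reusing generate_sequence from the sketch above. The source does not say what bit is shifted into the vacated seed position, so a random fill bit is assumed; each fact pairs the 100 input bits with the 101st bit as the target.

    import random

    def shift_right(seed):
        # Shift the seed right one position; the incoming bit is assumed random
        # (the source does not specify what fills the vacated position).
        return [random.randint(0, 1)] + seed[:-1]

    def build_training_facts(n_facts=1000):
        # Groups of five correlated facts share one seed, shifted one more
        # position per fact; a fresh random seed starts each new group.
        facts = []
        seed = [random.randint(0, 1) for _ in range(31)]
        for i in range(n_facts):
            if i > 0 and i % 5 == 0:
                seed = [random.randint(0, 1) for _ in range(31)]   # new random seed
            bits = generate_sequence(seed, length=101)   # 100 input bits plus the 101st
            facts.append((bits[:100], bits[100]))        # (input pattern, target bit)
            seed = shift_right(seed)
        return facts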
The network learned all 1,000 training facts to within 10%. A test set of 500 strings of 100 bits each was generated, and the network got 468 of the 500 correct.
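BrainMaker itself is a commercial package, so as an illustration only, a generic back-propagation classifier can stand in for it on the same kind of facts. The sketch below uses scikit-learn's MLPClassifier with an assumed hidden-layer size and iteration count; it will not reproduce the 468/500 result reported above.

    from sklearn.neural_network import MLPClassifier

    train = build_training_facts(1000)      # training facts from the sketch above
    test = build_training_facts(500)        # a separate set of test facts
    X_train, y_train = [f[0] for f in train], [f[1] for f in train]
    X_test, y_test = [f[0] for f in test], [f[1] for f in test]

    net = MLPClassifier(hidden_layer_sizes=(50,), max_iter=1000)   # sizes are guesses
    net.fit(X_train, y_train)
    print("test accuracy:", net.score(X_test, y_test))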
Inputs:
1st Number
2nd Number
...
100th Number

Output:
Next Number