Apr 19, 2018

Does higher compression or entropy lead to faster learning? Or does higher randomness mean there is less that can be learned?

a.   If information can be squeezed, compressed, or represented more tersely, then the amount of "storage" needed to keep it can be minimized.

b.   Minimizing the need for storage also helps to minimize the need for transfer, or communication.

c.   So if the amount of communication can be minimized, then there is a possibility of making learning FASTER.   The less the data, or the more structured the data, the easier it is to "see" the pattern, and thus to "learn".

d.   Compression of information is closely tied to ENTROPY: the output of a good compressor looks nearly random, so the more thoroughly data is compressed, the higher its entropy per byte (the first sketch after this list illustrates this).

e.   So the correlation of entropy and learning rate could possibly imply that higher-entropy data is also correlated with faster learning.

f.   But (e) goes against the conventional wisdom that random data, being totally random, carries much less structured information, and is therefore more difficult to extract any pattern from, or to LEARN anything from (the second sketch after this list illustrates this).
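To make (a), (b), and (d) concrete, here is a minimal sketch in Python, using only the standard library. The four-symbol input, the sample size, and the compression level are arbitrary illustrative choices, not anything prescribed above:

    import math
    import random
    import zlib
    from collections import Counter

    def entropy_bits_per_byte(data: bytes) -> float:
        """Empirical Shannon entropy of a byte string, in bits per byte."""
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

    random.seed(0)
    # Low-entropy, "structured" input: bytes drawn from only four symbols.
    structured = bytes(random.choices(b"abcd", k=100_000))
    compressed = zlib.compress(structured, 9)

    print(len(structured), len(compressed))      # storage shrinks: claims (a), (b)
    print(entropy_bits_per_byte(structured))     # about 2 bits per byte
    print(entropy_bits_per_byte(compressed))     # near the 8-bit maximum: claim (d)

On a typical run the input compresses to roughly a quarter of its size, and the byte-level entropy jumps from about 2 bits per byte to nearly 8: the compressed stream is the one that looks random.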
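And to poke at (c), (e), and (f), here is a second hedged sketch: the same simple learner is trained once on labels that follow a hidden linear rule and once on pure coin-flip labels. The perceptron, the dimensions, and the epoch count are all arbitrary choices for illustration, not a definitive experiment:

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 500, 20
    X = rng.normal(size=(n, d))

    # "Structured" labels: generated by a hidden linear rule.
    w_true = rng.normal(size=d)
    y_structured = np.sign(X @ w_true)

    # "Random" labels: pure coin flips, with no pattern behind them.
    y_random = rng.choice([-1.0, 1.0], size=n)

    def train_accuracy(X, y, epochs=50):
        """Train a plain perceptron; return final training accuracy."""
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (xi @ w) <= 0:   # misclassified: nudge the weights
                    w += yi * xi
        return float(np.mean(np.sign(X @ w) == y))

    print(train_accuracy(X, y_structured))   # climbs toward 1.0
    print(train_accuracy(X, y_random))       # stays far below 1.0

On the structured labels the training accuracy climbs quickly toward 1.0; on the random, maximum-entropy labels it stays far lower no matter how long it trains. At least in this toy setting, the conventional wisdom in (f) wins out over the speculation in (e): high entropy in the data itself means there is less to learn, while high entropy in a compressed encoding is just the signature of structure that has already been squeezed out.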
