Entropy is a fascinating thing. It's one of the fundamental concepts of physics, and it also serves as a foundation of information theory.
In essence, entropy indicates the disorder of a system, though the term "disorder" is actually a bit misleading in this context. Entropy is really about how spread out or distributed the states of the system are.
In information theory, the entropy of a message indicates the amount of information that can potentially be transmitted. If the receiver already knows or can predict the content of the message, the entropy is low. So more new information means more entropy, and by the (bad) definition, more disorder.
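To make that concrete, here is a small sketch of Shannon entropy in Python (the function name `shannon_entropy` is my own choice for illustration). A message made of one repeated symbol is fully predictable and carries zero bits per symbol, while a message where every symbol is different maximizes the surprise:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average bits of information per symbol in the message."""
    counts = Counter(message)
    total = len(message)
    # Sum -p * log2(p) over the observed symbol frequencies.
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A fully predictable message carries no new information:
print(shannon_entropy("aaaaaaaa"))  # 0.0
# Eight distinct, equally likely symbols give the maximum, 3 bits each:
print(shannon_entropy("abcdefgh"))  # 3.0
```

Note this measures only symbol frequencies; real predictability also depends on context the receiver has, which simple frequency counts don't capture.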
This means that when you receive new information and learn new things, your brain's entropy increases. That disorder, of course, leads to the spread of knowledge and the distribution of information.
I have been studying entropy, quantum mechanics, and T-symmetry recently (just for fun), but I really need to get my brain messed up good before I can truly grasp what it's all about. Once I'm there, I will disorganize your brain too with all that information.