Entropy
The definitions of entropy in [[thermodynamics]] and in [[information theory]] are formally equivalent: both measure how probability is spread over a system's possible states.
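Concretely, the Gibbs entropy of statistical mechanics and the Shannon entropy of information theory share one functional form, differing only by Boltzmann's constant $k_B$ and the base of the logarithm:

$$ S = -k_B \sum_i p_i \ln p_i \qquad H = -\sum_i p_i \log_2 p_i $$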
Information requires differentiation: in its absence, signals degenerate into mere noise, and noise is another way of describing entropy. In this sense, a system with high entropy might be better described not as disordered but as homogenised or uniform. This framing will matter when addressing the globalising phase of capitalism, which tends precisely to remove diversity.
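A minimal sketch of that point in Python (illustrative, not from the original note): the more homogenised a distribution, the higher its Shannon entropy, peaking when every outcome is equally likely.

```python
import math

def shannon_entropy(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A differentiated source: one symbol dominates, so each symbol carries little surprise.
peaked = [0.97, 0.01, 0.01, 0.01]
# A homogenised source: every symbol equally likely, so entropy is maximal.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # ~0.24 bits
print(shannon_entropy(uniform))  # 2.0 bits, the maximum for 4 symbols
```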