Entropy

The definitions of entropy in [[thermodynamics]] and in [[information theory]] are formally equivalent: Gibbs entropy and Shannon entropy have the same functional form, differing only by Boltzmann's constant and the choice of logarithm base.
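
For reference, the two standard definitions (a textbook identity, not specific to the sources linked here):

```latex
% Gibbs entropy of a system with microstate probabilities p_i
S = -k_B \sum_i p_i \ln p_i

% Shannon entropy of a source with symbol probabilities p_i
H = -\sum_i p_i \log_2 p_i
```

With natural logarithms the two coincide up to the constant: S = k_B H, so thermodynamic entropy is Shannon entropy measured in units of k_B.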

Information requires differentiation: in its absence, the signals degenerate into mere noise, and this noise is another way of describing entropy. In this sense, a system with high entropy might better be described not as disordered, but as homogenised or uniform. This way of seeing things will be very important in addressing the globalising phase of capitalism, which precisely tends to remove diversity.

[[The Entropy of Capitalism]]
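
A minimal sketch of that reading, in plain Python (the distributions are illustrative, not taken from the book): Shannon entropy is maximised by the uniform distribution, so a fully homogenised signal is exactly the maximum-entropy one.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A differentiated signal: one symbol dominates, so it carries structure.
peaked = [0.85, 0.05, 0.05, 0.05]

# A homogenised signal: all symbols equally likely -- indistinguishable from noise.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # ~0.85 bits
print(shannon_entropy(uniform))  # 2.0 bits, the maximum for 4 symbols
```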
