📚 node [[entropy]]
- [[entropy]] is the [[source]] of [[computation]].
- its destination is usually a [[state of higher order]].
- [[wp]] https://en.wikipedia.org/wiki/Entropy
The definitions of entropy in [[thermodynamics]] and in [[information theory]] are equivalent.
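One common way to make this precise: the Gibbs entropy of statistical mechanics and Shannon's information entropy have the same form, differing only by Boltzmann's constant $k_B$ and the base of the logarithm:

$$S = -k_B \sum_i p_i \ln p_i \qquad\qquad H = -\sum_i p_i \log_2 p_i$$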
Information requires differentiation: in its absence, signals degenerate into mere noise, and this noise is another way of describing entropy. In this sense, a system with high entropy might better be described not as disordered but as homogenised or uniform. This way of seeing things will be very important in addressing the globalising phase of capitalism, which tends precisely to remove diversity.
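A minimal sketch of the "high entropy as homogenised" reading (the example distributions below are made up for illustration): Shannon entropy is maximised by a uniform distribution, while a sharply differentiated one carries far less.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A sharply differentiated (peaked) distribution: most of the mass on one outcome.
peaked = [0.97, 0.01, 0.01, 0.01]

# A homogenised (uniform) distribution over the same four outcomes.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # ~0.24 bits
print(shannon_entropy(uniform))  # 2.0 bits, the maximum for 4 outcomes (log2 4)
```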
📖 stoas
- public document at doc.anagora.org/entropy
- video call at meet.jit.si/entropy
⥱ context
⥅ related node [[entropy and diversity]]
⥅ related node [[cross entropy]]
⥅ related node [[entropy and the capitalist system with robert biel]]
⥅ related node [[the entropy of capitalism]]
🔎 full text search for 'entropy'