Entropy and the Second Law of Thermodynamics

Explore entropy increase, energy injection, and maintaining order

What is Entropy?

Entropy is a measure of system disorder. In statistical mechanics, entropy is proportional to the logarithm of the number of accessible microstates. The more microstates, the more disordered the system, and the higher the entropy.

Boltzmann Entropy Formula

S = kB · ln Ω
  • S - Entropy
  • kB - Boltzmann constant (1.38×10⁻²³ J/K)
  • Ω - Number of microstates
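A minimal sketch of the Boltzmann formula in Python; the function name and example values are illustrative:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega: float) -> float:
    """S = k_B * ln(Omega) for a system with Omega microstates."""
    return K_B * math.log(omega)

# Doubling the number of microstates adds exactly k_B * ln 2 of entropy,
# regardless of the starting count:
delta_s = boltzmann_entropy(2e6) - boltzmann_entropy(1e6)
```

Because the formula is logarithmic, multiplying the microstate count by a factor always adds the same fixed amount of entropy.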

Information Entropy

H = −Σ pᵢ · log₂ pᵢ
  • H - Information entropy (uncertainty)
  • pᵢ - Probability of event i
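The Shannon formula can be sketched directly from the definition; the function name is illustrative:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)) in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a certain outcome carries none.
fair = shannon_entropy([0.5, 0.5])
certain = shannon_entropy([1.0])
```

Entropy is maximal for a uniform distribution and zero when one outcome is certain, matching the intuition that entropy measures uncertainty.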

Second Law of Thermodynamics

Isolated System

The entropy of an isolated system never decreases: ΔS ≥ 0

  • Reversible process: ΔS = 0
  • Irreversible process: ΔS > 0

Open System

Open systems can achieve local entropy reduction through external energy input

ΔS_total = ΔS_system + ΔS_surroundings ≥ 0
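As a worked check of this inequality, consider heat flowing irreversibly from a hot reservoir to a cold one; a minimal sketch (function name and values illustrative):

```python
def heat_flow_entropy(q, t_hot, t_cold):
    """Total entropy change (J/K) when heat q (J) flows from a reservoir
    at t_hot (K) to a reservoir at t_cold (K)."""
    ds_hot = -q / t_hot    # the hot reservoir loses heat
    ds_cold = q / t_cold   # the cold reservoir gains it
    return ds_hot + ds_cold

# 1000 J flowing from 400 K to 300 K: the cold side gains more
# entropy than the hot side loses, so the total is positive.
ds = heat_flow_entropy(1000.0, 400.0, 300.0)
```

Since t_cold < t_hot, the gain q/t_cold always exceeds the loss q/t_hot, so spontaneous heat flow from hot to cold satisfies ΔS_total > 0.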

Implications

  • Time has a direction (arrow of time)
  • Heat death (universe reaches thermal equilibrium)
  • Life and ordered structures require continuous energy input

Possibility of Entropy Reduction

While the entropy of an isolated system never decreases, open systems can achieve local entropy reduction. The key is external energy or information input.

Refrigerator/AC

Consumes electricity, reduces indoor entropy, but total entropy (including surroundings) increases
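Requiring ΔS_total ≥ 0 for the refrigerator fixes the minimum work needed to pump heat out of the cold space (the Carnot COP limit). A sketch under that assumption, with an illustrative function name:

```python
def min_work_to_cool(q_cold, t_cold, t_hot):
    """Minimum work (J) to remove heat q_cold (J) from a space at t_cold (K)
    and reject it to surroundings at t_hot (K).

    From dS_total = -q_cold/t_cold + (q_cold + w)/t_hot >= 0:
        w >= q_cold * (t_hot / t_cold - 1)
    """
    return q_cold * (t_hot / t_cold - 1.0)

# Removing 1000 J from a fridge at 275 K into a 300 K room
# requires at least ~91 J of electrical work.
w = min_work_to_cool(1000.0, 275.0, 300.0)
```

The work input dumps extra heat into the room, so the surroundings' entropy rises by at least as much as the fridge's falls.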

Living Systems

Maintain a low-entropy state through metabolism, exporting high-entropy waste

Crystal Growth

Releases heat, reduces system entropy, but environmental entropy increases more

Information Processing

Erasing information requires energy (Landauer's principle) and increases entropy
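Landauer's bound states that erasing one bit dissipates at least k_B·T·ln 2 of energy. A minimal sketch (function name illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_k: float) -> float:
    """Minimum energy (J) dissipated to erase one bit at temperature T (K):
    E >= k_B * T * ln 2."""
    return K_B * temperature_k * math.log(2)

# At room temperature (~300 K) the limit is on the order of 3e-21 J per bit,
# many orders of magnitude below what real hardware dissipates.
e_room = landauer_limit(300.0)
```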

Key Insight

Cost of Entropy Reduction: Decreasing a system's entropy requires consuming energy, which increases the entropy of the surroundings. The total entropy of the universe never decreases; the price of local order is greater global disorder.

Practical Applications

Thermodynamics

  • Heat engine efficiency limit (Carnot cycle)
  • Refrigerator design
  • Chemical reaction direction
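The Carnot efficiency limit mentioned above follows directly from the reservoir temperatures; a one-line sketch (function name illustrative):

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency of a heat engine operating between reservoirs
    at t_hot and t_cold (K): eta = 1 - t_cold / t_hot."""
    return 1.0 - t_cold / t_hot

# An engine between 500 K and 300 K can convert at most 40% of the
# absorbed heat into work; the rest must be rejected to the cold reservoir.
eta = carnot_efficiency(500.0, 300.0)
```

No real engine reaches this bound, since any irreversibility (friction, finite-rate heat transfer) generates extra entropy.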

Information Theory

  • Data compression
  • Cryptography
  • Communication channel capacity

Cosmology

  • Black hole entropy (Bekenstein-Hawking entropy)
  • Universe evolution
  • Big Bang

Biology

  • Metabolism
  • Evolution
  • Ecosystems