The concept of enthalpy change is central to understanding energy transfer during chemical reactions. Enthalpy change, denoted \( \Delta H \), is the heat absorbed or released by a chemical reaction at constant pressure. Although the total enthalpy of a system cannot be measured directly, the change between initial and final states can, and that change is what \( \Delta H \) represents.
In a chemical reaction, breaking bonds between atoms requires energy while forming new bonds releases it; the balance between the two determines the overall heat flow. If the reaction releases heat, it is exothermic and \( \Delta H \) is negative, indicating that the system is transferring heat to its surroundings. Conversely, if the reaction absorbs heat, it is endothermic and \( \Delta H \) is positive.
- An exothermic reaction example involves the formation of ammonia from nitrogen and hydrogen, where heat is released, and \( \Delta H \) is negative.
- An endothermic reaction might involve the decomposition of calcium carbonate, where the system absorbs heat from the surroundings, and \( \Delta H \) is positive.
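The two examples above can be written as thermochemical equations. The \( \Delta H \) values shown are approximate standard values quoted per mole of reaction as written:

\[ \mathrm{N_2(g) + 3\,H_2(g) \rightarrow 2\,NH_3(g)} \qquad \Delta H \approx -92\ \text{kJ mol}^{-1} \]

\[ \mathrm{CaCO_3(s) \rightarrow CaO(s) + CO_2(g)} \qquad \Delta H \approx +178\ \text{kJ mol}^{-1} \]

The sign alone identifies the type of reaction: negative for the exothermic synthesis of ammonia, positive for the endothermic decomposition of calcium carbonate.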
It is crucial to remember that a quoted \( \Delta H \) is usually given per mole of a specified substance in the balanced equation. Stoichiometry and the mole concept then scale this value to the total heat change for the amount of substance actually reacting or being produced.
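As a minimal sketch of that scaling step, the calculation below converts a sample mass to moles and multiplies by the per-mole enthalpy change. The figures used (\( \Delta H \approx +178 \) kJ/mol for the decomposition of calcium carbonate, molar mass of CaCO\(_3\) \( \approx 100.09 \) g/mol) are approximate illustrative values:

```python
# Scaling a per-mole enthalpy change to an actual sample size.
# Illustrative values: CaCO3(s) -> CaO(s) + CO2(g), ΔH ≈ +178 kJ/mol.
DELTA_H_KJ_PER_MOL = 178.0   # kJ/mol, positive because the reaction is endothermic
MOLAR_MASS_CACO3 = 100.09    # g/mol

def heat_change_kj(mass_g: float) -> float:
    """Total heat absorbed (kJ) when mass_g of CaCO3 decomposes."""
    moles = mass_g / MOLAR_MASS_CACO3    # mole concept: n = m / M
    return moles * DELTA_H_KJ_PER_MOL    # total heat: q = n * ΔH

print(f"Decomposing 25.0 g of CaCO3 absorbs {heat_change_kj(25.0):.1f} kJ")
```

The same two-step pattern (mass to moles, then moles times \( \Delta H \)) applies to any reactant or product, provided the mole ratio from the balanced equation is taken into account.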