Apparatus used to measure the energy released or absorbed by a reaction.

## Specific Heat Capacity

Specific heat capacity (SHC) is the amount of energy required to increase the temperature of 1 g of a substance by 1 °C (or 1 K). It is symbolised by 'c', and its units are J g⁻¹ °C⁻¹. Water, for example, has an SHC of 4.184 J g⁻¹ °C⁻¹. Relative to substances such as sand, which has an SHC of about 0.48 J g⁻¹ °C⁻¹, water's SHC is high, reflecting the hydrogen bonding that holds water molecules together. Sand lacks this extra intermolecular bonding, so less energy is required to raise its temperature by 1 °C.

$E=m\times c\times \Delta T$

SHC is especially useful for working out how much energy is needed to change the temperature of a given mass of a substance, or, conversely, the temperature change produced by a given amount of energy.
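The relationship above can be sketched in Python. This is a minimal illustration using the SHC values quoted for water and sand; the function name and masses are hypothetical.

```python
# Energy needed to heat a substance: E = m * c * delta_T
def heat_energy(mass_g, shc, delta_t):
    """Return energy in joules, given mass (g), SHC (J g^-1 °C^-1)
    and temperature change (°C)."""
    return mass_g * shc * delta_t

# Heating 100 g of water (c = 4.184) by 10 °C: about 4184 J
e_water = heat_energy(100, 4.184, 10)

# The same rise in 100 g of sand (c = 0.48) needs far less: about 480 J
e_sand = heat_energy(100, 0.48, 10)
```

Note how the roughly ninefold difference in SHC translates directly into a ninefold difference in the energy required for the same temperature change.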

## Calorimetry

A calorimeter is an instrument used to measure the energy change of a reaction: how much energy is absorbed or released, depending on the type of reaction.

The parts of a bomb calorimeter are:

- An insulated outer layer, which ensures that no heat escapes from or enters the system.
- A thermometer, which measures the temperature change.
- An electric heater, which is used to calibrate the calorimeter.
- A pressurised vessel, in which the reaction takes place.

It is important to note that the pressurised vessel held within the calorimeter is not insulated, so heat passes freely into the surrounding water. This ensures that the heat released by the reaction is absorbed by the water, where the temperature change can easily be measured.

The first step in using a calorimeter is to calibrate it, that is, to find its calibration factor: how much energy is required to change the temperature within the calorimeter by 1 °C. This can be done using the following formulae:

$E=V\times I\times t$

where E is the energy released by the heater in joules, V is the voltage in volts, I is the current in amps and t is the time in seconds.

$\text{Calibration Factor}=\frac { E }{ \Delta T }$
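The calibration step can be sketched as follows. The heater settings and temperature rise here are hypothetical values chosen for illustration.

```python
# Calibration: run the electric heater, record V, I, time,
# and the temperature rise of the calorimeter.
def heater_energy(volts, amps, seconds):
    return volts * amps * seconds      # E = V * I * t, in joules

def calibration_factor(energy_j, delta_t):
    return energy_j / delta_t          # J per °C

# Hypothetical run: 6.0 V at 2.0 A for 300 s raises the water by 4.0 °C.
e = heater_energy(6.0, 2.0, 300)       # 3600.0 J
cf = calibration_factor(e, 4.0)        # 900.0 J °C^-1
```

The calibration factor (here 900 J °C⁻¹) characterises this particular calorimeter and is reused for every subsequent reaction measured in it.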

Once the calibration factor has been determined, the reaction of interest can take place, and the thermometer will measure the resulting temperature change of the calorimeter. The energy change is then found using the equation:

$E_{\text{released/absorbed}}=\text{Calibration Factor}\times \Delta T$
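Applying the equation above is a single multiplication; a short sketch, continuing with a hypothetical calibration factor of 900 J °C⁻¹:

```python
# With the calorimeter calibrated, the energy change of a reaction
# follows directly from the measured temperature change.
def reaction_energy(cal_factor, delta_t):
    return cal_factor * delta_t        # joules

# Hypothetical: CF = 900 J °C^-1, reaction raises the temperature by 2.5 °C.
e_measured = reaction_energy(900, 2.5)   # 2250.0 J
```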

To deduce ΔH of a reaction, the quickest approach is to use ratios to scale the measured energy to one mole of reactant, which gives the absolute value of ΔH. The sign, positive or negative, depends on whether the temperature of the calorimeter increased or decreased. An increase in temperature indicates that the reaction is exothermic, so ΔH must be negative; a decrease in temperature indicates that the reaction is endothermic, so ΔH must be positive.
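The mole scaling and sign convention above can be captured in one small function. The energy and mole values below are hypothetical, for illustration only.

```python
# Convert a measured energy change to a molar enthalpy change (kJ mol^-1)
# with the correct sign: temperature rise => exothermic => negative ΔH.
def delta_h_kj_per_mol(energy_j, moles, temp_increased):
    dh = (energy_j / 1000) / moles     # scale J to kJ, then to per mole
    return -dh if temp_increased else dh

# Hypothetical: 2250 J released by 0.010 mol, temperature rose,
# so the reaction is exothermic: about -225 kJ mol^-1
dh = delta_h_kj_per_mol(2250, 0.010, temp_increased=True)
```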