Heat rate is a term commonly used in power stations to indicate a power plant's efficiency. The heat rate is the inverse of the efficiency: a lower heat rate is better.
[math]\displaystyle{ \text{Heat Rate} =\frac{\text{Thermal Energy In}}{\text{Electrical Energy Out}} }[/math]
Efficiency is a dimensionless measure (sometimes quoted in percent), and strictly speaking heat rate is dimensionless as well, but it is often written as energy in per unit of energy out, in whichever units are convenient. In SI units it is joule per joule, but it is also commonly expressed as joule per kilowatt hour or British thermal units per kilowatt hour (Btu/kWh).[1] This is because the kilowatt hour is typically used for electrical energy, while the joule or Btu is commonly used for thermal energy.
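For example (the value chosen here is purely illustrative), since 1 kWh = 3,600 kJ ≈ 3,412 Btu, a dimensionless heat rate of 3 is equivalent to

[math]\displaystyle{ 3 \times 3600\ \text{kJ/kWh} = 10{,}800\ \text{kJ/kWh} \approx 10{,}236\ \text{Btu/kWh} }[/math]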
In the context of power plants, heat rate can be thought of as the input needed to produce one unit of output. It generally indicates the amount of fuel required to generate one unit of electricity. Performance parameters tracked for any thermal power plant, such as efficiency, fuel costs, plant load factor, and emissions level, are functions of the station heat rate and can be linked to it directly, as illustrated below.[2]
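As one illustration of this linkage, the fuel cost of generation scales directly with the heat rate:

[math]\displaystyle{ \text{Fuel cost per kWh} = \text{Heat Rate} \times \text{Fuel price per unit of thermal energy} }[/math]

With hypothetical figures, a plant with a heat rate of 10,000 Btu/kWh burning fuel priced at $3 per million Btu incurs a fuel cost of about $0.03 per kWh.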
Because heat rate and efficiency are inversely related, it is straightforward to convert from one to the other.
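With the heat rate expressed per kilowatt hour, the efficiency is simply the energy content of one kilowatt hour divided by the heat rate:

[math]\displaystyle{ \text{Efficiency} = \frac{3600\ \text{kJ/kWh}}{\text{Heat Rate (kJ/kWh)}} = \frac{3412\ \text{Btu/kWh}}{\text{Heat Rate (Btu/kWh)}} }[/math]

For example, a heat rate of 10,500 Btu/kWh corresponds to an efficiency of about 32.5%, and a heat rate of 9,000 kJ/kWh to an efficiency of 40%.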
Most power plants have a target or design heat rate. The difference between the actual heat rate and this target is the heat rate deviation.
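Expressed as a formula:

[math]\displaystyle{ \text{Heat Rate Deviation} = \text{Actual Heat Rate} - \text{Target Heat Rate} }[/math]

Since a lower heat rate is better, a positive deviation indicates that the plant is consuming more heat per kilowatt hour than intended.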