
Initially, around the 1600s, water thermometers were used to measure temperature; then, in the 18th century, mercury came into use because it expands linearly with temperature. But how did they know that mercury expands linearly with temperature?

Suppose, for example, I make thermometer A with water (just as an example; it could be any liquid whose expansion with temperature is nonlinear) and thermometer B with mercury, and I calibrate both at the boiling point and the freezing point of water. What about the temperatures in between? If I measure the thermal expansion of mercury with thermometer B, it will appear linear; similarly, if I measure the thermal expansion of water with thermometer A, it will also appear linear. But if I swap the thermometers, both substances will appear to expand nonlinearly. How do I know which one is correct?

One answer I get is that we can fix the pressure and the amount of a gas (for example, in a closed container with a piston on top) and, according to the ideal gas equation $PV=nRT$, measure the change in volume corresponding to a change in temperature; in this way we could verify that the mercury thermometer is more accurate. But the ideal gas equation is simply the combination of three laws (Boyle's law, Charles' law, and Avogadro's law), and those laws were derived from observations made with thermometers. So we can't use these laws to build a thermometer, since thermometers were used to derive them in the first place. Besides, how do we know which gas behaves ideally without a thermometer?
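The calibration circularity described above can be sketched numerically. In this toy model we pretend to know the "true" temperature and give each liquid an invented volume law (`volume_mercury` linear, `volume_water` deliberately quadratic; the coefficients are made up for illustration, not real data). Each thermometer is calibrated only at 0 and 100, then reads temperature by linear interpolation of its liquid's volume:

```python
def volume_mercury(T):
    """Toy law: volume linear in the 'true' temperature T (degrees C)."""
    return 1.0 + 1.8e-4 * T

def volume_water(T):
    """Toy law: volume deliberately nonlinear in T."""
    return 1.0 + 1.0e-4 * T + 2.0e-6 * T**2

def make_thermometer(volume):
    """Two-fixed-point calibration: mark the volume at T=0 as '0',
    the volume at T=100 as '100', and interpolate linearly between."""
    v0, v100 = volume(0.0), volume(100.0)
    def reading(T):
        return 100.0 * (volume(T) - v0) / (v100 - v0)
    return reading

mercury_thermo = make_thermometer(volume_mercury)
water_thermo = make_thermometer(volume_water)

# Both thermometers agree at the fixed points by construction,
# but disagree everywhere in between: at a "true" 50 degrees the
# mercury thermometer reads 50, the water thermometer does not.
for T in (0.0, 50.0, 100.0):
    print(T, mercury_thermo(T), water_thermo(T))
```

Each thermometer, used on its own liquid, reports perfectly linear expansion by construction; used on the other liquid, it reports nonlinear expansion. The numbers alone cannot say which liquid "really" expands linearly, which is exactly the circularity in the question.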

So, to summarise, my questions are:

  1. How did early scientists know that a mercury thermometer is more accurate than a water thermometer, and how did they know that mercury expands linearly with temperature?
  2. Was the concept of the ideal gas used to fabricate the thermometer or was a thermometer used to derive the ideal gas equation?
