Although initially defined by the freezing point of water (and later the melting point of ice), the Celsius scale is now officially a derived scale, defined in relation to the Kelvin temperature scale.
Zero on the Celsius scale (0 ℃) is now defined as equivalent to 273.15 K, with a temperature difference of 1 ℃ equivalent to a difference of 1 K, meaning the unit size in each scale is the same. This means that 100 ℃, previously defined as the boiling point of water, is now defined as equivalent to 373.15 K.
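The fixed offset of 273.15 between the two scales can be sketched as a pair of conversion functions (the function names here are illustrative, not standard):

```python
def celsius_to_kelvin(t_c: float) -> float:
    """Convert a Celsius temperature to kelvins: same unit size, offset by 273.15."""
    return t_c + 273.15

def kelvin_to_celsius(t_k: float) -> float:
    """Convert a kelvin temperature to degrees Celsius."""
    return t_k - 273.15

print(celsius_to_kelvin(0))      # 273.15 (the defined zero of the Celsius scale)
print(celsius_to_kelvin(100))    # 373.15 (formerly the boiling point of water)
```

Because the unit sizes are identical, the conversion is a pure offset with no scaling factor.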
The Celsius scale is an interval scale but not a ratio scale, meaning it measures relative rather than absolute temperature. This can be seen because the temperature interval between 20 ℃ and 30 ℃ is the same as between 30 ℃ and 40 ℃, but 40 ℃ does not represent twice the thermal energy of 20 ℃.
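The interval-versus-ratio distinction can be checked numerically: converting to the absolute Kelvin scale shows that doubling a Celsius reading does not double the absolute temperature.

```python
def celsius_to_kelvin(t_c: float) -> float:
    # Celsius to kelvins: fixed offset of 273.15, same unit size.
    return t_c + 273.15

# Ratio of the absolute temperatures corresponding to 40 C and 20 C.
ratio = celsius_to_kelvin(40) / celsius_to_kelvin(20)
print(round(ratio, 3))  # about 1.068, far from 2
```

Only on an absolute (ratio) scale such as kelvin do ratios of temperature values have physical meaning.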
A temperature difference of 1 ℃ is equivalent to a temperature difference of 1.8 °F.
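Note that this 9⁄5 factor applies only to temperature *differences*, not to temperature values, which also require an offset of 32 °F. A minimal sketch (function names are illustrative):

```python
def delta_c_to_delta_f(d_c: float) -> float:
    """Convert a temperature *difference* from Celsius to Fahrenheit degrees.

    Differences scale by 9/5 with no offset.
    """
    return d_c * 9 / 5

print(delta_c_to_delta_f(1))    # 1.8
print(delta_c_to_delta_f(100))  # 180.0: the 0-100 C span covers 180 F degrees
```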
The Newton scale was devised by Isaac Newton. He defined the "zeroth degree of heat" as melting snow and "33 degrees of heat" as boiling water. His scale is thus a precursor of the Celsius scale, being defined by the same temperature references. The unit of this scale, the Newton degree, therefore equals 100⁄33 kelvins or degrees Celsius, and the scale has the same zero as the Celsius scale.
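Since the two scales share a zero and Newton's 33 degrees span the same interval as 100 Celsius degrees, the conversion is a pure scaling by 100⁄33 (the function name below is illustrative):

```python
def newton_to_celsius(t_n: float) -> float:
    """Convert Newton degrees to Celsius: shared zero, 33 N spans 100 C."""
    return t_n * 100 / 33

print(newton_to_celsius(0))   # 0.0: melting snow on both scales
print(newton_to_celsius(33))  # 100.0: boiling water on both scales
```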