I am really having difficulty accepting this theory!
For example, let's say we introduce a quantity of water at an initial temperature somewhere near the boiling point into an environment at an ambient temperature of -40°C (which is conveniently equal to -40°F).
1. A certain amount of energy, at the rate of 1 calorie per gram per degree Celsius (the specific heat of liquid water), will have to be removed from that water just to reduce its temperature to the freezing point (0°C, or 32°F).
2. An additional amount of energy, at the rate of 79.72 calories per gram (the latent heat of fusion), must be withdrawn from that water to convert it to ice (perhaps ice crystals), still at 0°C, or 32°F; the liquid must turn to solid before the temperature can continue to fall.
3. Of course, still more energy, this time at the specific heat of ice (roughly 0.5 calories per gram per degree Celsius), will have to be removed from the resulting quantity of ice to reduce its temperature from the freezing point to ambient (say -40°C, conveniently = -40°F, as noted above).
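To make that bookkeeping concrete, here is a minimal Python sketch of the three stages. The constants are standard textbook values; the function name and the 95°C example temperature are just my own illustration.

```python
# Rough energy budget for cooling 1 g of water from t_initial down to a
# sub-zero ambient temperature, passing through freezing at 0 °C.
C_WATER = 1.0     # cal / (g * deg C), specific heat of liquid water
L_FUSION = 79.72  # cal / g, latent heat of fusion of water
C_ICE = 0.5       # cal / (g * deg C), approximate specific heat of ice

def heat_removed_per_gram(t_initial, t_ambient=-40.0):
    """Calories that must be removed from 1 g of water to take it from
    t_initial (deg C, above freezing) down to t_ambient (deg C, below freezing)."""
    stage1 = C_WATER * (t_initial - 0.0)   # 1. cool the liquid to the freezing point
    stage2 = L_FUSION                      # 2. freeze it at 0 deg C
    stage3 = C_ICE * (0.0 - t_ambient)     # 3. cool the resulting ice to ambient
    return stage1 + stage2 + stage3

print(heat_removed_per_gram(95.0))  # water near boiling: ~194.72 cal/g
```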
The only variable in the above calculation is the initial temperature of the water.
Clearly, the warmer the water, the more energy it will have to lose in the process.
Thus if it takes any amount of time at all for the energy loss to occur, it is quite impossible for hot water to freeze faster than cold water!
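To put rough numbers on that, here is a quick comparison using the same textbook constants as the sketch above (the 90°C and 10°C starting temperatures are just illustrative values I picked):

```python
# Total calories to remove per gram, for hot vs. cool starting water,
# both ending as ice at the -40 °C ambient.
for t0 in (90.0, 10.0):
    total = 1.0 * t0 + 79.72 + 0.5 * 40.0
    print(f"start at {t0:4.1f} °C -> about {total:.2f} cal/g to remove")
# -> about 189.72 cal/g vs. about 109.72 cal/g
```

On pure energy bookkeeping, the hotter sample always has more to lose.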
Of course, for relatively small quantities of water and such a low ambient temperature, that incremental energy loss can probably occur almost instantly, regardless of the initial temperature of the water. That is especially true if you toss a cup of water up into the air, as suggested in an experiment described elsewhere in this forum, thereby scattering the water as droplets and greatly increasing the surface area over which the heat transfer occurs. (I note that in that case they did not try the experiment with the same quantity of water at an initial temperature close to the freezing point!)
Or is there something I'm missing?