Using the waste heat from a liquid-fluoride thorium reactor to desalinate seawater is something that I’ve been thinking about for a long time. It seems like a perfect opportunity since the reactor needs to reject waste heat in order to produce electricity, and that waste heat will be available in large amounts.
But first of all, why does a reactor produce waste heat? That seems rather…wasteful, doesn’t it? This was one of those principles that took a bit of thinking and studying for me to understand. If thermodynamicists lived on a perfect planet, it would be at absolute zero (about 273 degrees Celsius below zero). It wouldn’t be too fun for the rest of us, but the thermodynamicists would really enjoy it, because all heat could be turned completely to work. Engines could be 100% efficient and there wouldn’t be any waste heat.
But fortunately for the rest of us we don’t live on a planet at absolute zero, so thermodynamicists have to content themselves with engines that are less than 100% efficient and have to reject waste heat. How much waste heat you have to reject depends on how hot you can generate heat in the first place.
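That relationship between the hot and cold temperatures is just the Carnot limit, and it's easy to play with numerically. Here's a small sketch (the 300 °C steam temperature is an illustrative number I've picked, not one from the post):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat convertible to work, for source and
    sink temperatures given in Kelvin: 1 - T_cold / T_hot."""
    return 1.0 - t_cold_k / t_hot_k

# On the thermodynamicists' "perfect planet" at absolute zero,
# all heat becomes work:
print(carnot_efficiency(573.0, 0.0))    # 1.0, i.e. 100% efficient

# Rejecting heat at a realistic ~32 C (305 K) condenser,
# with steam generated at ~300 C (573 K):
print(carnot_efficiency(573.0, 305.0))  # about 0.47
```

Real plants fall well short of even that ideal figure, which is how you end up at roughly one-third efficiency.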
Fire can heat water to 100 degrees Celsius (373 Kelvin) before it starts to boil at atmospheric pressure. But if you keep the water under higher pressure it will be hotter when it boils, and you will be able to get work out of the high-pressure steam produced from the hot water. If you let the high-pressure steam expand back down to atmospheric pressure, you can get some work out of it, like the old steam locomotives of the 1800s did. But if you let the steam expand to pressures even below atmospheric pressure, you can get even more work out of it. The downside of doing this is that you then have to condense the steam back to water using outside cooling water, rather than simply exhausting it to the atmosphere, the way the old locomotives just blew it out the cylinder.
Well, in super-simple terms, that’s what’s going on in nuclear plants and coal plants all over the world. Water is pumped up to high pressure and boiled, then the steam is expanded to subatmospheric pressure in a big steam turbine and condensed using cooling water, and the process starts all over again. It’s fairly efficient—much more so than the steam locomotive—and lets you turn about 1/3rd of the heat into electricity. The other 2/3rds of the heat has to be “dumped” into your cooling water.
So maybe all this waste heat can be used for desalination, right? Considering that a 1000 megawatt nuclear power plant has to produce about 3000 megawatts of heat and dump 2000 megawatts of that heat to the cooling water, it would stand to reason that something could be done with all that waste heat, right?
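The arithmetic behind those numbers is just the one-third efficiency applied to the plant's electrical output:

```python
electric_mw = 1000.0
efficiency = 1.0 / 3.0  # roughly a third of the heat becomes electricity

# Heat the reactor must produce to deliver 1000 MW of electricity:
thermal_mw = electric_mw / efficiency

# Everything that isn't electricity gets dumped to the cooling water:
rejected_mw = thermal_mw - electric_mw

print(thermal_mw, rejected_mw)  # 3000.0 2000.0
```

Two megawatts of waste heat for every megawatt of electricity—that's the tantalizing resource in question.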
Well, as it turns out, not really. The reason why has to do with the fact that in our pursuit of maximum efficiency, we made the temperature in the steam condenser just about as low as we could get away with. In fact, the pressure in the steam condenser of a typical nuclear power plant is only about 5% of atmospheric pressure. Inside the condenser at 5% of atmospheric pressure, steam will condense at a temperature of only about 32C (91F). But the cooling water flowing over those condensation tubes is at atmospheric pressure, and it isn’t going to boil at 32 degrees—it needs 100C. It’s going to take something a lot hotter than the waste heat to get it to boil. Hence, no desalination from waste heat.
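You can check that 32 C figure with water's vapor-pressure curve. Here's a rough estimate using the Antoine correlation for water (the coefficients below are the standard textbook values for roughly 1–100 C; this is a back-of-the-envelope check, not a steam-table lookup):

```python
import math

# Antoine coefficients for water: log10(P_mmHg) = A - B / (C + T_celsius)
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(p_mmhg):
    """Temperature (Celsius) at which water's vapor pressure equals p_mmhg,
    i.e. the temperature at which it boils (or steam condenses) at that pressure."""
    return B / (A - math.log10(p_mmhg)) - C

p_condenser = 0.05 * 760.0  # 5% of atmospheric pressure, in mmHg
print(round(boiling_point_c(p_condenser), 1))  # roughly 33 C
print(round(boiling_point_c(760.0), 1))        # roughly 100 C at full atmosphere
```

So steam at 5% of atmospheric pressure condenses at about 32–33 C, while seawater sitting at full atmospheric pressure won't boil until 100 C. The waste heat simply isn't hot enough to drive boiling on the other side.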
(to be continued…)