Jan 16, 17 / Aqu 16, 01 06:17 UTC

Nuclear Power in Space  

What are people's views on the use of nuclear energy in space?

I don't want to hear that people are against nuclear fission/fusion in space on principle, as stars are natural fusion reactors...

The specific feedback I would like is on the use of non-solar power generators in space: in space vessels, space stations, fully autonomous vessels, and satellites.

Larger objects in space present a larger target for micrometeorites or space debris. While I'm a big fan of what solar panels can do, the big problem I see is that the larger the array, the greater the impact on functioning satellites, space stations, or space vessels should one be hit.

There are many types of reactors; the most common ones produce a lot of radioactive waste. I have been reading about thorium reactors, which are inherently more stable, safer, and produce far less nuclear waste. Thorium is highly abundant, and I believe that 1 cm³ of thorium could power a car for a year.
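
That "power a car for a year" figure is easy to sanity-check. The Python sketch below uses round illustrative numbers of my own choosing (complete fission of the bred fuel at ~200 MeV per atom, thorium's density, and a rough annual car energy budget); under those assumptions the claim looks, if anything, conservative:

```python
# Back-of-envelope check: energy in 1 cm^3 of thorium vs. a car's annual use.
# Assumes every Th-232 atom is eventually bred to U-233 and fissioned,
# releasing ~200 MeV per fission (a common round figure).
MEV_TO_J = 1.602e-13           # joules per MeV
AMU_TO_KG = 1.661e-27          # kilograms per atomic mass unit

energy_per_fission = 200 * MEV_TO_J             # J per atom fissioned
atom_mass = 232 * AMU_TO_KG                     # kg per Th-232 atom
energy_per_kg = energy_per_fission / atom_mass  # ~8.3e13 J/kg

thorium_density = 11.7e3                        # kg/m^3
mass_1cm3 = thorium_density * 1e-6              # kg in 1 cm^3 (~11.7 g)
energy_1cm3 = energy_per_kg * mass_1cm3         # ~9.7e11 J

# Assume a car driving 15,000 km/yr at ~0.7 MJ/km (a rough EV-ish figure).
car_annual_energy = 15_000 * 0.7e6              # ~1.05e10 J
years_of_driving = energy_1cm3 / car_annual_energy
print(f"{years_of_driving:.0f} years of driving")  # on the order of 90 years
```

Of course, no real reactor fissions every atom, so treat this as an upper bound rather than an engineering figure.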

While solar is highly effective, with newer panel technology providing increasing efficiencies, thorium would provide a safe, compact energy source that requires less shielding and is cheap to operate due to its reduced supporting-infrastructure requirements.

Space vessels and autonomous vehicles that might be deployed to recover space debris or cargo launched into space would benefit from power sources that are compact and not prone to the damage that can occur when traveling through space-debris hot spots.

The two countries with significant historical research into thorium reactor technology are the United States (through research done in the 1950s-1970s) and India, through its current thorium reactor program.

Jan 16, 17 / Aqu 16, 01 09:15 UTC

This post has been moved from General Discussion to the Technology forum for further discussion. Thanks for understanding.

Jan 16, 17 / Aqu 16, 01 11:50 UTC

Radioactive waste is not a big problem in space; for example, you could shoot it into the Sun. But there are other problems:

~ Heat removal from the reactor

~ Microgravity

~ Radiation protection around the reactor

~ If a disaster happens, you have to collect the radioactive waste before it enters the atmosphere.

  Updated  on Jan 16, 17 / Aqu 16, 01 11:50 UTC, Total number of edits: 1 time

Jan 16, 17 / Aqu 16, 01 12:14 UTC

Generally, putting nuclear power anywhere a failure can cascade into other problems isn't clever. Luckily, power can be transmitted, so the reactor doesn't actually have to be anywhere it can cause a problem.

Typically it's quite safe, as long as you maintain coolant. Since infrared radiation is pretty much the only option for shedding heat in vacuum, it should be interesting to watch you mitigate megawatts or gigawatts of thermal energy with it. You think a large array of solar panels is a bad idea; what do you think of a few dozen square miles of radiators? These leak coolant when fractured, causing a much bigger problem. There are ways to mitigate this, but they increase the mechanical complexity of the system, and scaling that up over a few dozen square miles is a massive headache waiting to happen.

Thorium reactors are pretty ancient really, as you point out: 1950s technology. Assuming you're referring to LFTR designs, the fluoride salt used as an emergency heat sink in most models as a meltdown-mitigation strategy (untested, IIRC) isn't quite suited to microgravity applications, AFAIK. Low gravity like the Moon might work, and artificial gravity via centrifugal force is a possibility. Also, fluid dynamics change in microgravity... They produce just as much nuclear waste as any other type of reactor: decay aside, you get just as much nuclear material out as you put in, though in some configurations, known as breeder reactors, it's possible to create isotopes that are useful to the medical community. For use in space I'd suggest just centrifuging the lot into Sol - waste management sorted. I'm not sure what makes you think this requires "reduced infrastructure" compared to other reactor models - they are all much of a likeness: generate heat, boil water, spin turbine... Claims of it being safer are mostly perspective. Sure, when you're juggling chainsaws, if you throw them higher you have longer to catch them - but that doesn't make it a safe thing to be doing...

Solar isn't that effective at all, really. Newer panel technologies have increased efficiencies, but it's still lacking. There are the MMOD issues you've highlighted (but we'll get that cleaned up before we're mass-deploying anything), but even more critical is the short service life. Sure, they'd need replacing less often than a thorium reactor would need reloading, but they burn out - the closer they are to the Sun, the faster that happens. They also become quite useless out past Jupiter due to the lower photon density. I'd rather be replacing units every few decades and picking up bits of broken glass than having to deal with a few tonnes of radioactive fluoride, though. I'd leave this sort of tech for Jupiter or further out...
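
The "lower photon density" past Jupiter follows directly from the inverse-square law. A quick Python sketch (1361 W/m² is the standard solar constant at 1 AU; the orbital radii are rounded values):

```python
# Solar flux falls off with the square of distance from the Sun.
SOLAR_CONSTANT = 1361.0  # W/m^2 at 1 AU

def solar_flux(distance_au):
    """Irradiance in W/m^2 at a given distance from the Sun, in AU."""
    return SOLAR_CONSTANT / distance_au ** 2

for name, au in [("Earth", 1.0), ("Mars", 1.52), ("Jupiter", 5.2), ("Saturn", 9.5)]:
    print(f"{name:8s} {solar_flux(au):7.1f} W/m^2")
# Jupiter sees only ~50 W/m^2 -- about 4% of what panels get at Earth.
```

So the same panel that produces a kilowatt at Earth orbit delivers tens of watts at Jupiter, before even accounting for degradation.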

Personally, I'm more interested in the likes of a Dyson-loop satellite harvesting Sol's dynamo and beaming power back this way. Clean, safe, and it should only require replacing every few generations.

"Space vessels" currently tend to generate power by nuclear means via RTGs - radioisotope thermoelectric generators. These don't tend to be overly efficient; the Voyager missions started off with just under 500 W.
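
For context on how RTG output falls over time: Pu-238, the usual RTG fuel, has a half-life of about 87.7 years, so the thermal source decays exponentially. This Python sketch uses ~470 W as Voyager's approximate launch-time electrical power (a commonly quoted figure); the real electrical decline is steeper because the thermocouples also degrade:

```python
# Thermal decay of an RTG fueled with Pu-238 (half-life ~87.7 years).
HALF_LIFE_YEARS = 87.7

def rtg_power(initial_watts, years):
    """Power remaining after 'years', from radioactive decay alone."""
    return initial_watts * 0.5 ** (years / HALF_LIFE_YEARS)

p0 = 470.0  # approximate Voyager electrical power at launch, W
for t in (0, 10, 20, 40):
    print(f"year {t:2d}: {rtg_power(p0, t):5.1f} W")
# Decay alone leaves ~340 W after 40 years; thermocouple degradation
# pushes the actual figure lower still.
```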

  Updated  on Jan 16, 17 / Aqu 16, 01 12:23 UTC, Total number of edits: 1 time
Reason: typo

Jan 16, 17 / Aqu 16, 01 16:57 UTC

You are right, EyeR. I think we can use reactors only on asteroids, not on a space station.

Jan 18, 17 / Aqu 18, 01 20:27 UTC

I like the idea. They actually ran a thorium reactor through the same kind of experiment that caused Chernobyl to melt down, and the thorium reactor safely shut down. Also, we don't need to worry as much about the radioactive waste. The original thorium reactor in the US ran for years, if memory serves, and they never replaced the control rods. Not to mention, the spent rods in the US could be used to fuel thorium reactors; we would actually be removing waste from Earth. Generally I am against nuclear power, but in this case I am for it. However, if the fuel is transported by rocket and something were to go wrong, that could make thousands of square miles uninhabitable for a very long time.

  Last edited by:  Anthony Johnson (Asgardian)  on Jan 18, 17 / Aqu 18, 01 20:28 UTC, Total number of edits: 1 time

Jan 19, 17 / Aqu 19, 01 17:43 UTC

I would think cooling the reactors in space would be fairly simple: just expose the cooling liquids/rods to the well-below-freezing temperatures of space, with ambient temperatures in the neighborhood of -454 °F (-270 °C). Short of the "absolute zero" achieved in labs, it doesn't get much colder than that. As for nuclear radiation, having a separate module, lined or unlined, tied into the station would avoid most if not all radiation exposure. Granted, if there is any significant radiation leakage, the reactor was either designed or built very poorly.

Jan 19, 17 / Aqu 19, 01 17:53 UTC

I believe that, absent a more practical and efficient method of energy generation, nuclear power is pretty much the only viable option off-hand. (There are various issues with adapting any terrestrial method of power generation for space, given that they all had to contend with gravity during their inception/implementation.)

(And yes, I do know of the high-efficiency solar panels that several labs around the world are working on, but they are still in the prototype phase and not available for widespread implementation and/or deployment.)

Jan 20, 17 / Aqu 20, 01 13:53 UTC

pwmmal, thermal management isn't as easy as sticking something out the window into space. As there's nothing out there, there's nothing to accept the heat, in the way that fire transfers heat into the surrounding air, for example. AFAIK you have to rely on IR radiation, which is nowhere near as effective.

Jan 20, 17 / Aqu 20, 01 14:30 UTC

You don't expose the rods to space, or you lose the thermal energy you're trying to harvest. Just like on Earth, you transfer that thermal energy to a fluid (commonly), which then transfers its thermal energy to another fluid to prevent nuclear contamination of the electrical generation system.

Now you've got the fluid all loaded up with thermal energy, and the "sensible" thing to do would be to pump it into radiators to "expose to the freezing temperatures of space". But due to the lack of matter in space you're then limited to bleeding heat via IR, as convection/conduction isn't viable without something to convect/conduct through... This drops the efficiency of the thermal dissipation to about 28%.

To take the ISS as an example: it generates on the order of 100 kW of power, and at first 14 kW of thermal dissipation was thought adequate. Wrong. The EATCS was soon upgraded to 80 kW. This consists of 14 panels, each measuring 6 by 10 feet (1.8 by 3 meters), for a total of 1680 square feet (156 square meters) of ammonia-tubing-filled heat-exchange area, ignoring the solar panels also acting as radiators. That's 156 m² for 80 kW.

So for dissipating a single megawatt - which is tiny, really, considering the requirements of some installations; mass habitation is likely to require terawatts of dissipation - we're talking roughly 1950 m² just to stop the generated heat from baking everything. That's not dealing with any heat from the electrical generation itself, the use of that electricity, extra solar input, or any "biological input" to the thermal system. That'll all be extra paneling. Now start working out the array sizes for gigawatts or terawatts...
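
The area figures above can be reproduced from the Stefan-Boltzmann law. This Python sketch sizes an idealized double-sided radiator; the emissivity, the 350 K coolant temperature, and the neglect of absorbed sunlight are all simplifying assumptions of mine. Real systems like the ISS's come out needing more area because the ammonia loop runs much cooler and sees environmental heat loads:

```python
# Idealized radiator sizing: P = emissivity * sigma * A * T^4 per radiating side.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(power_watts, temp_k, emissivity=0.9, sides=2):
    """Panel area needed to radiate 'power_watts' at temperature 'temp_k'."""
    flux_per_m2 = emissivity * SIGMA * temp_k ** 4 * sides
    return power_watts / flux_per_m2

# Dumping 1 MW with coolant at 350 K:
area = radiator_area(1e6, 350.0)
print(f"{area:.0f} m^2 of double-sided panel")  # ~650 m^2 in the ideal case
```

Note the strong T⁴ dependence: running the radiators hotter shrinks the area dramatically, which is why high-temperature reactor coolants are attractive in space despite everything else.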

Ignoring the most obvious problem - thermal dissipation - you've also got to take into account how fluid dynamics change in microgravity.

Thorium reactors have a vastly different design from the reactor that was operating at Chernobyl; simulating the precise failure chain shouldn't be viable. I'm unaware of the emergency fluoride heat sink ever having been tested, as that would first require losing control of the reactor and then letting it melt through the reaction chamber. Theory only. Yes, you do have to worry about radioactive waste - it produces plenty. Even factoring in decay, you get tonnes more radioactive matter out than you put in, as you irradiate the fluoride salt (assuming an LFTR design) used to cool it.

  Updated  on Jan 20, 17 / Aqu 20, 01 14:43 UTC, Total number of edits: 1 time
Reason: typo

Jan 20, 17 / Aqu 20, 01 21:03 UTC

Thank you for explaining the thermal dissipation problems of cooling the reactor. Condescension aside, it was helpful. So how would you cool the reactor in microgravity if convection is out and IR thermal dissipation is unrealistic for the size required?

Jan 22, 17 / Aqu 22, 01 04:34 UTC

Wow, I had no idea that this post would foster such a discussion from a wide range of people. I appreciate the input and further research from those who have continued the discussion.

A few people have spoken about bleeding off the heat that builds up from the various elements of any space habitation, i.e., from people, power generation, electrical systems, machinery, and heat accumulated from our Sun.

A few years ago a young girl entered a science competition with a flashlight powered by the heat given off by the hand holding it. Could this type of technology be used to convert IR heat from a cooling system into electricity? My guess is that this could provide a compact way to generate electricity and could be built into a non-pressurized section of a space station. I'm aware that pressurized sections of any space structure are at a premium and increase costs dramatically compared to non-pressurized areas that could be used for equipment rarely frequented by people.

Has anyone heard of or know anything about this type of power source?

Jan 23, 17 / Aqu 23, 01 11:34 UTC

This might be the Peltier effect? If so, it would work on the premise that one side is colder than the other. The problem with that is you have to cool the other side, so you have the same problem you started with.

Jan 24, 17 / Aqu 24, 01 10:50 UTC

It's more likely the Seebeck effect, which is closely related to the Peltier effect. It might even be a metamaterial. Without examining the technology it's unwise to guess.

However, such systems - including Stirling engines - actually remove remarkably little thermal energy. They remove some, certainly, and could possibly find use as an auxiliary, complementary system, but they're unlikely to form the primary means of dissipation.
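
The "remarkably little" point can be quantified with the standard thermoelectric figure of merit ZT: maximum conversion efficiency is the Carnot limit times a ZT-dependent factor. In this Python sketch the 500 K / 300 K temperatures and ZT = 1 are illustrative assumptions of mine (ZT around 1 is typical of good commercial materials):

```python
# Maximum efficiency of a thermoelectric generator with figure of merit ZT:
#   eta = (1 - Tc/Th) * (sqrt(1 + ZT) - 1) / (sqrt(1 + ZT) + Tc/Th)
import math

def te_efficiency(t_hot, t_cold, zt):
    """Peak conversion efficiency for hot/cold junction temperatures in K."""
    carnot = 1.0 - t_cold / t_hot
    m = math.sqrt(1.0 + zt)
    return carnot * (m - 1.0) / (m + t_cold / t_hot)

eta = te_efficiency(500.0, 300.0, 1.0)
print(f"{eta:.1%}")  # ~8% -- the other ~92% still has to be radiated away
```

So a thermoelectric layer on the coolant loop converts only a small slice of the heat to electricity; the bulk of the thermal load remains for the radiators.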

As to how I would do it - that's the thing. I'm not aware of a "better" method, or it would have been provided. The natural instinct to "just increase surface area" is eventually going to cause more problems. A few dozen square miles is a lot of fragile exposed surface area to get cracked open and start bleeding coolant. New methods are certainly required.

I did entertain the notion of several "radiator barges" that dock, exchange warm coolant for cold, then drift off to cool down elsewhere. Enough of these in rotation could reduce the requirement for "local" thermal dissipation - but that relies on a lot more systems not going wrong.

Jan 24, 17 / Aqu 24, 01 15:10 UTC

I heard a long time ago that mass could be used to transfer heat, kind of like you describe (from memory it was directly ejected). Depending on how much heat you're generating, it could be feasible.
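
Using ejected (or ferried-away) mass as a heat sink can be sized with basic calorimetry. A Python sketch using water as the working mass; the 20 °C to 100 °C temperature swing is my assumption, and the latent heat of vaporization does most of the work if you boil it off:

```python
# Heat absorbed per kg of water warmed from 20 C to 100 C and then boiled.
SPECIFIC_HEAT = 4186.0          # J/(kg K), liquid water
HEAT_OF_VAPORIZATION = 2.256e6  # J/kg at 100 C

def coolant_mass_rate(power_watts, delta_t=80.0):
    """kg/s of water needed to absorb 'power_watts' by heating and boiling."""
    energy_per_kg = SPECIFIC_HEAT * delta_t + HEAT_OF_VAPORIZATION
    return power_watts / energy_per_kg

rate = coolant_mass_rate(1e6)   # dumping 1 MW
per_day = rate * 86400
print(f"{rate:.2f} kg/s  (~{per_day / 1000:.0f} tonnes/day)")
# Roughly 0.4 kg/s, i.e. tens of tonnes of water per day per megawatt.
```

That tonnage is why open-cycle cooling (or coolant barges) only makes sense where mass is cheap, such as near an icy asteroid, rather than for anything launched from Earth.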

There's a stackexchange answer here http://worldbuilding.stackexchange.com/a/67312 that references this tool for calculating similar things: http://www.engineeringtoolbox.com/radiation-heat-transfer-d_431.html

  Last edited by:  Sean Jardine (Asgardian)  on Jan 26, 17 / Aqu 26, 01 11:27 UTC, Total number of edits: 1 time
Reason: folks were having trouble with link