Yeah, it's unlikely any single method will suffice; the end result will be a product with multiple techniques employed to reach completion. For most examples I'm using current industrial hardware as a "guide", but initially intended on a smaller scale. Size, IMHO, is not as much of a problem as complexity. There's a lot of room in space, and it can get bigger once it's up there; the word "unfold" was chosen carefully.
The mention of lasers was as a "carving method" for forming 3D shapes entirely within itself, though lacking experience I'm unaware of how practical this may be. It's likely mechanical grinding will generate much less thermal residue, and thus be easier to dissipate. I consider it more of a "stretch goal" than a "feature".
Constant attention should be supplied by itself. As this system is made of lots of other systems, first each individual system must be made aware of itself. A common problem with repetitive action in the industrial sector is "unexpected" input: something is in the way, has become misaligned, or has failed to operate as intended. As I'm sure you're aware, things like infrared beams, hall sensors, and ultrasonic devices to measure volume/distance are currently and commonly used to detect such conditions, and we can do similar. Less common is visual processing. This is another field in rapid development (with a lot of good work being open sourced) and should certainly be employed by us to detect "anomalies", most of which it should be able to handle autonomously; anything it's not absolutely positive about, it can ask for help.

There should then exist "above" this a "control layer" that understands how each piece works and can provide sensible, vague instructions, which the layers below act on (ie: "unit 001, print /blueprints/vessel/tug/main_chassis", and then the applicable device looks at the file, figures out what it needs to pull it off, and puts in a request to the controller, which then submits a request to "stores", which delivers what's required or returns err. Materials get fed in, in sequence). Alongside it sits another layer that watches all the telemetry and metadata from subcomponents. I specifically mentioned things like calibration cycles to collect metadata because this is something that, for a given design, should be reasonably consistent. All models should exhibit behaviours within a window, and each individual unit should exhibit behaviours like itself. For example, the first 2D CNC I'm planning on building a little later this year is little more than an etch-a-sketch mechanism (but larger scale) with stepper motors instead of knobs.
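The blueprint-request flow could be sketched roughly like this. Everything here (the `Controller` and `Stores` classes, the material names, the blueprint contents) is an illustrative placeholder, not a real API:

```python
# Minimal sketch of the "control layer" request flow: a vague instruction
# comes in, the controller looks up what's needed, and "stores" either
# delivers the required materials or returns err.

class Stores:
    """Tracks raw material stock and fulfils requests from the controller."""
    def __init__(self, stock):
        self.stock = dict(stock)

    def request(self, material, qty):
        if self.stock.get(material, 0) >= qty:
            self.stock[material] -= qty
            return True   # delivers required
        return False      # returns err

class Controller:
    """Turns 'unit 001, print <blueprint>' into material requests."""
    def __init__(self, stores, blueprints):
        self.stores = stores
        self.blueprints = blueprints  # blueprint path -> material requirements

    def print_job(self, unit, blueprint_path):
        needs = self.blueprints[blueprint_path]
        for material, qty in needs.items():
            # Note: a real system would reserve atomically, not deduct piecemeal.
            if not self.stores.request(material, qty):
                return f"err: {unit} short on {material}"
        return f"ok: {unit} printing {blueprint_path}"

stores = Stores({"steel": 100, "copper": 20})
ctl = Controller(stores, {"/blueprints/vessel/tug/main_chassis": {"steel": 60, "copper": 5}})
result = ctl.print_job("unit 001", "/blueprints/vessel/tug/main_chassis")
print(result)  # ok: unit 001 printing /blueprints/vessel/tug/main_chassis
```

The point of the sketch is the separation: the controller knows nothing about machining, only about who needs what; the device knows nothing about stock levels.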
Nothing fancy or complex, but an excellent PoC that lets me play with things like jitter and vibrational compensation (as well as giving me some more tooling to build better things). The bars I've got for runners at the moment are 600mm. I've not got as far as actually constructing a design, but at a loose guess: shave 5mm each side for mounting (a deliberate over-estimate) and lose 10mm for the slide bushings, which should leave about 580mm of travel. Whilst not massive, this isn't entirely useless, either; add 10cm around the edges for the table/case and a roughly 610mm footprint isn't overly cumbersome.

However, this is not going to be its final form. Rather than doing the maths and re-writing firmware for each generational improvement (like increasing runner length, or putting more teeth on the cog driving the belt), I'd considered that each mechanical operation can instead be measured to completion in a "calibration cycle". Ie: the X-axis carriage moves until it hits the near limit sensor, then counts steps on the stepper motor, RPM, and time until it hits the far limit sensor. Armed with such a baseline it can "measure itself" as it changes, and if this data is regularly observed it can also generate pre-failure reports: an action that used to take 1.354 seconds now fluctuates between 1.354 and 1.367; something that used to heat up to 35.1°C±0.1°C now heats up to 36.5°C; something that consumed 48W over a cycle now eats 51W, etc...
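The pre-failure-report idea reduces to a very small loop: store a baseline from a known-good calibration cycle, then diff each subsequent cycle against it. A minimal sketch, using the figures from above; the tolerance values are my own arbitrary assumptions, not measured numbers:

```python
# Sketch of the calibration-cycle idea: record a baseline for a given design,
# then flag any cycle whose readings drift outside a per-metric tolerance.

BASELINE = {"cycle_s": 1.354, "temp_c": 35.1, "power_w": 48.0}
TOLERANCE = {"cycle_s": 0.005, "temp_c": 0.5, "power_w": 1.0}  # assumed windows

def check_cycle(reading):
    """Compare one calibration run against the stored baseline; return a list
    of pre-failure warnings for every metric outside its tolerance window."""
    warnings = []
    for key, base in BASELINE.items():
        drift = abs(reading[key] - base)
        if drift > TOLERANCE[key]:
            warnings.append(f"{key}: {base} -> {reading[key]} (drift {drift:.3f})")
    return warnings

# The worn-machine example from above: timing flutter, extra heat, extra draw.
report = check_cycle({"cycle_s": 1.367, "temp_c": 36.5, "power_w": 51.0})
for line in report:
    print(line)
```

In practice the baseline would be re-learned after each deliberate hardware change (longer runners, different cog), which is exactly what avoids re-doing the maths per generation.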
The likes of tools shattering I'd not expect to be common. There's a reason things like this happen: manufacturing defects are one, inappropriate environmental stresses another. I'd like to think much of this can be avoided if approached "correctly". However, when it does inevitably occur, I'd expect the damage caused to be quite minimal, cosmetic at worst, quite simply because this should be expected to occur and provisions made. Shielding springs to mind; there should be little requirement to expose anything particularly damageable. A bit breaking up at high RPM can end up a projectile, but there's only so much kinetic energy it's possible to transfer. Even if we need to coat the inside of the cutting area in a MgAl3 foam with a non-newtonian fluid as an insulating barrier between the two: if a .45 will turn to powder on impact, I'm sure a titanium / tungsten carbide machining bit will just bounce off.
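To put a rough number on that kinetic-energy ceiling (my own illustrative figures, not any particular tool spec): a 5g fragment leaving the tip of a 20mm-diameter bit at 20,000 RPM carries on the order of a joule, versus roughly 500J muzzle energy for a .45 ACP round.

```python
import math

# Back-of-envelope: kinetic energy of a fragment thrown off a spinning bit.
# Inputs (5g fragment, 10mm tip radius, 20,000 RPM) are illustrative guesses.
def fragment_energy_j(mass_kg, radius_m, rpm):
    omega = rpm * 2 * math.pi / 60   # angular velocity, rad/s
    v = omega * radius_m             # tangential (tip) speed, m/s
    return 0.5 * mass_kg * v ** 2    # KE = 1/2 * m * v^2

ke = fragment_energy_j(0.005, 0.010, 20_000)
print(f"{ke:.2f} J")  # roughly 1 J: orders of magnitude below a firearm round
```

So even a fairly pessimistic fragment is a hazard to unshielded optics or wiring, not to the structure.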
Naysaying is good, as long as it's not just saying nay but instead providing problems it's possible to sink teeth into and tear to shreds. The more specific the problem, the more precise the resolution. You might have thought of something someone else hasn't, and not saying it isn't going to do anyone any good. The more problems that can be identified, and the earlier, the sooner solutions can be entertained, rationalised, and selected/rejected. Which should, in theory, stop a lot of mistakes happening. Thus is the power of "collaboration": the sum of the whole is greater than all the parts.