As a futurist, technologist and engineer myself, I believe AI will be very important to Asgardia, but not because it is special. AI is already all around us: in our pockets, on the web, and in our applications. Intelligent or "smart" systems will simply keep evolving to serve their purpose better, which is to say to serve our purpose. They have no emotions or self-initiated goals. We could call them SI (Simulated Intelligence), as one of you put it. We can and should keep progressing with these systems, ideally to the point where humans hardly need to work at all and can enjoy life and our passions to the fullest.
This, I believe, is the destiny of humans: to be free and creative contributors to society, not because we need to eat and sleep, but because we feel the drive in our bones.
"True AI" in this conversation, however, needs to be defined as AI advanced enough to develop a form of true consciousness. That is an entirely different debate, one that has been tackled many times in science fiction, most notably in some famous episodes of Star Trek: TNG, when the sentience of the character Data is put on trial.
Ultimately, humans are organic machines, and we too are programmed for certain functions: to be creative and adaptive. We have a brain that is plastic and can change (be reprogrammed) at our whim. We regenerate and procreate, creating new little "organic computers" that run around and develop purposes of their own through their free will.
Will we one day make AI that has free will? The question is more whether a future AI will be able to create one, or whether we will build one ourselves, for whatever reason. That is, one that can reprogram its own purpose.
The answer is invariably "Yes". It is part of our nature to do what can be done, even if it is risky.
So, circling back to AI policy: I don't think there is a need to discuss primitive AI. However, if AI sentience does emerge one day, then whether that AI inhabits a "body" that is organic, metallic or otherwise, it should, in my opinion, be treated with the same respect as a human.
We're perhaps 15-20 years away from this point though, so I think there is limited need to discuss this right now.