Frankly, even the crappy A.I. we have now would be an improvement on our current U.K. governmental system, regardless of party.*
Seriously, I think the 'existential threat' comes from the reliance on A.I. to do work that, while complex, could be done by us poor wetware.
Life, especially in the fields of science, technology and the military, has become increasingly complex. Why teach children to read maps when you have GPS and Google Maps? Those children can instead be 'taught' to use a technology that replaces the 'traditional stuff'.
Sure, the world marches on and the 'new' becomes 'traditional', but how many sci-fi and post-Armageddon books and movies point out lost skills that, while old-fashioned, are valuable?
The fear of A.I. triggering a nuclear apocalypse stems from the idea of a machine (no matter how complex) looking at a situation with pure logic and no emotion. Look at how most nuclear weapon launch systems rely on a multi-level decision mechanism. Two or more humans must agree to launch. If one baulks, there's a 'circuit-breaker'. This 'fuse' is based on human emotion, not logic.
I, for one, am glad it's there, but frankly I think it's flawed. No matter what psychological assessments you use, humans are flawed and can utterly defy prediction.
I'm not au fait with top military thinking (if you can label it as that), but I'm pretty sure the "Guys at the Top" are jealous of their prerogative to initiate a white-hot, radioactive war that reduces the Earth to a ball of molten dirt. The point is, A.I. might be used as a tool to 'do the heavy lifting' but, ultimately, it doesn't have the key to the forklift.
I refer, m'lud, to the popular 1980s film WarGames, which explored the use of computers in military strike deployment.
* This is humour, not an opening of a discussion about A.I. having political power. We've got enough trouble on our hands with weapon systems.