We all know that artificial intelligence (AI) can do marvelous things and that it is currently being incorporated into many industries. Yet modern AI has an Achilles' heel: it seems universally non-robust. That is, it can be unstable to tiny perturbations of its input or generalize in unpredictable ways, both of which can lead to AI-generated hallucinations. In this tutorial, we will investigate the many causes of instability, unpredictable generalization, and hallucinations in AI-based image reconstruction. Furthermore, we will provide guidance on how to mitigate non-robustness and untrustworthy outputs in AI-based image reconstruction.
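To give a feel for what "unstable to tiny perturbations" means, here is a minimal numerical sketch (not taken from any specific method in this tutorial, and using a synthetic forward operator of our own choosing): reconstructing a signal x from measurements y = Ax by naively inverting an ill-conditioned A. A relative measurement perturbation of about 10^-6 produces a reconstruction error many orders of magnitude larger, which is exactly the amplification behavior that makes robustness in image reconstruction delicate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ill-conditioned forward operator: rapidly decaying singular values
# (condition number ~ 1e8). This stands in for a difficult reconstruction problem.
n = 20
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -8, n)
A = U @ np.diag(s) @ V.T

x_true = rng.standard_normal(n)
y = A @ x_true

# Tiny measurement perturbation with relative size exactly 1e-6.
e = rng.standard_normal(n)
e *= 1e-6 * np.linalg.norm(y) / np.linalg.norm(e)

# Naive reconstruction: solve A x = y directly, with and without the perturbation.
x_hat = np.linalg.solve(A, y)
x_pert = np.linalg.solve(A, y + e)

rel_in = np.linalg.norm(e) / np.linalg.norm(y)
rel_out = np.linalg.norm(x_pert - x_hat) / np.linalg.norm(x_hat)
print(f"relative input change:  {rel_in:.1e}")
print(f"relative output change: {rel_out:.1e}")  # many orders of magnitude larger
```

The same amplification mechanism, inherited from the underlying inverse problem, is one reason learned reconstruction maps can react dramatically to perturbations that are invisible to the eye.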