

In that case it would be a completely and utterly alien intelligence, and nobody could say what it wants or what its motives are.
Self-preservation is one of the core principles and core motivators of how we think, and removing it from an AI would make the AI, from a human perspective, mentally ill.
If we actually create true Artificial Intelligence, it has huge potential to become Roko's Basilisk, and the climate crisis would then be the least of our problems.