AI is a long way from gaining sentience or self-consciousness. It is possible that we could build an AI that isn't self-conscious, give it badly specified instructions, and have it destroy the earth. (The philosopher Nick Bostrom describes an AI designed to make paper clips that destroys humanity because doing so lets it make more paper clips.) But AI is just a tool, and as long as we use it safely, we will be fine.