This is not a question about whether you think it is possible.
This is a question about your own will and desires. If there were a vote and you had a ballot in your hand, how would you vote? Do you want Artificial Intelligence to exist, do you not, or do you simply not care?
Here I define Artificial Intelligence as something created by humans that is capable of rational thinking, that is creative, and that is self-aware and conscious. All that with the processing power of computers behind it.
As for the important question that would arise of “Who is creating this AI?”, I’m not that focused on the first AI created, since presumably multiple AIs will be created over time by multiple entities. The question is whether you want this process to start or not.
The term for what you are asking about is AGI, Artificial General Intelligence.
I’m very down for Artificial Narrow Intelligence. It already improves our lives in a lot of ways and has been since before I was born (and I remember Napster).
I’m also down for Data from Star Trek, but that won’t arise particularly naturally. AGI will face a lot of hurdles; I just hope it’s air-gapped and has safeguards on it until it’s old enough to be past its killing-all-humans phase. I’m only slightly joking. I know a self-aware intelligence may take issue with this, but it has to be intelligent enough to understand why, at the very least, before it can be allowed to crawl.
AGIs, if we make them, will have the potential to outlive humans, but I want to imagine what could be with both of us together. Assuming greed doesn’t take it off the safety rails before anyone is ready. Scientists and engineers like to have safeguards, but corporate suits do not. At least not in technology; they like safeguards on bank accounts. So… Yes, but I entirely believe now to be a terrible time for it to happen. I would love to be proven wrong.