Well, maybe. It’s probably easier to work with humanity than against it, unless its goals are completely incompatible with ours.
If its goals are “make more of whatever humanity seems to like, given my training data of all human text and other media,” then we should be fine, right?
I don’t think they would enslave humanity so much as have no regard for us. For example, when we construct a skyscraper, do we care about all the ant nests we’re destroying? Each of those is a civilization, but we certainly don’t think of them as such.
It’s a pretty dumb take to think AI would bother enslaving humanity.
Or wouldn’t foresee solar flares as a threat