Well, I think that no scientist would make a true AI because of the threat of it becoming smarter than humans and taking over (e.g., Terminator). They would just program robots to do one specific task at a time, thus destroying any chance of a hostile sentient takeover.
It's a bit alarming, though, and I don't believe humans have to give such "advanced" traits to robots/AI. But it is frightening to think that humans have the potential to kill themselves through technology and science.