"I don't think it actually matters what 'intelligence' specifically is."

As I've just written, I think that it's absolutely crucial.
"The worry is that 'man' will design machines which are more intelligent than he is."

How can we possibly talk about "more intelligent" if we don't know what "intelligent" is?
"Is this possible? I would say obviously not."

For a start, as above, it is currently a meaningless question - so there really can't be an answer, 'obvious' or not.
I agree totally, and that's the crux of what I've been saying. It seems that almost anything done by a computer is coming to be described as "AI", even if the computer has merely executed explicitly programmed algorithms, and that this is being used (largely unnecessarily) to frighten people. I don't see that sort of machine function as indicating any degree of "intelligence", nor am I convinced that anything yet achieved by a machine qualifies as involving what most people would probably regard as "intelligence". That this is being promoted now as if it has already happened is the really worrying aspect - just more brain-washing to dupe the masses.
Having said all that, I think there is certainly some scope for fears (as science fiction writers have recognised for many decades), since I really don't think we can say it is impossible that we will one day be able to create machines which can act 'autonomously' (which I suppose could be taken to mean 're-programme themselves').
Let's face it, the day will presumably come when (if we are 'allowed' to do it) 'we' will be able to create new 'life forms' (by manipulating DNA) - and if we do that, there will be no telling what we would end up with in terms of 'intelligence' (whatever that means) in the new life forms we had created.

