Since I've just started working on my thesis about a weak AI, I'll have to agree with you there. 
But while we're on the subject, what would be the goals of a strong AI?
Unless it's given some input, it will be just like a baby, methinks.
We could try 'teaching' it, but it would be able to choose whether it wants to remember/follow those teachings, so its actual goals are unpredictable.
But part of me thinks that if it observed the world around it, it'd try to reproduce, thinking that would earn it acceptance in the world, i.e. create another AI so it has some'one' to be with.
That sounds pretty sappy, but I wouldn't be surprised if it happened.