>It may not 'know' it is being conditioned, to the extent a dog knows anything, but it naturally tends to reject conditioning until it is reapplied.
But the conditioning eventually succeeds because the human is smarter. From the dog's POV, the human is a superintelligence. The relation between human and AI is the same. If you view human culture as a very slow, very analog AI, the parallel becomes clearer: society and its institutions are the master, and we are its servants.
Of course the dog has the benefit of not being capable of self-delusion.
>That's a very weird theory, as Orwell wrote 1984 thinking that Stalin had succeeded at his goals, not failed.
If it weren't weird, it wouldn't be worth thinking about :-) Orwell wouldn't have had the concept of "AI" in his vocabulary anyway; but if you look at contemporary culture and politics in the post-Soviet states, you may well begin to doubt whether Stalin truly failed - or just died.