But isn't one major point of a Turing test that the subject doesn't know they're taking one? Like... if you applied the idea to a human to test whether they were a robot, they wouldn't know what you were testing for, which is part of the point.
For a robot to pass, it cannot know it's taking the test; otherwise the results are void and it technically fails. Knowingly fooling the examiner is beyond what the test is measuring. That would be more like Kirk and the Kobayashi Maru.