Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "If he has genius enough to build AI then it's not hard to understand that he kne…" (ytc_Ugzo4fGNQ…)
- "Bro, for 300 a month Grok heavy better do whatever the fuck I would want with it…" (ytc_UgwkxAa78…)
- "May want to tweak those algorithms. Five days to fill an order plus another five…" (ytc_UgwrYPC1a…)
- "This is the new future where robots and artificial intelligence are taking over …" (ytc_Ugy0cWQxV…)
- "This is what I fear the most. Humans being AWFUL to robots and AI, all good and…" (ytc_UgyNrOX61…)
- "You can't really use ASI as a puppet - that's the whole problem. Once something …" (ytr_UgztvlOJR…)
- "Being able to die is what makes living things alive eo how can somthing that can…" (ytc_Ugz1gKUCE…)
- "You are very right wrt AI, that it's people problem, even more since AI hallucin…" (ytc_UgyR7_Dc4…)
Comment
One important thing you didn't go over though, is that the Turing test has been re-written numerous times to increase complexity. This was done to compensate faster processors, bigger memory reserves and more complex programming.
When the original Turing test was drafted it merely asked for the program to convince the human after 5 minutes of conversation. But it has been adjusted numerous times to compensate diversionary tactics used by chatbots seeking to defeat the test based upon the manipulation of its rules.
youtube
2016-08-09T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugi1yh8I7Gqn_3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjXrER6VVXb0XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgijjsMpDmX7z3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Uggfndq2J7NnqngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghYgtQFZIyZQHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghH74kqk0AFK3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Uggxw68cAiW-M3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiKQ4yawP8DjngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghOhj5JFXSTzngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh42eOQdjXgzXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"}]
```
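A raw response like the one above is a JSON array of coding records, one per comment, with field names matching the dimensions in the Coding Result table. A minimal sketch of the look-up-by-comment-ID step — assuming only the JSON shape shown here, not any particular storage backend — parses the array and indexes it by `id`:

```python
import json

# Raw LLM response: a JSON array of coding records (field names mirror
# the dimensions in the Coding Result table). Two records abridged from
# the response shown above.
raw_response = """[
  {"id": "ytc_Ugi1yh8I7Gqn_3gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgijjsMpDmX7z3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Index records by comment ID so any coding can be fetched directly.
codings = {record["id"]: record for record in json.loads(raw_response)}

coding = codings["ytc_UgijjsMpDmX7z3gCoAEC"]
print(coding["reasoning"])  # consequentialist
print(coding["emotion"])    # approval
```

In practice a model can return malformed JSON, so a production version would wrap `json.loads` in error handling; the sketch omits that for brevity.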