Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Too many people who think AI is great have not seen The Terminator movie series.…" (ytc_Ugx1S5tkE…)
- "Fun fact in NSW Australia schools are not allowed to deduct marks based off of a…" (ytc_UgzCMZueq…)
- "1:14:30 when AI is explaining the human what a Turing tests is and the human is …" (ytc_UgykUb9m-…)
- "I actually disagree partly. To get the most out of AI you need to know how to pr…" (ytc_Ugx9cJ4kA…)
- "The robot is trying to grab the box. Instead of clamping on the box, it clamped …" (ytc_UgzUtbVk3…)
- "Ooof I can see it now… coming this spring new horror movie *”astro gon ape shit …" (ytc_Ugyga4Nco…)
- "Not to mention the resources and the water that will run out more and more as AI…" (ytr_Ugyqt942k…)
- "So... are you saying AI is actually great, considering how much amazing art it h…" (ytc_UgwaCq6v9…)
Comment
Fooling the Chinese room is actually very simple. Just keep asking it same question. It will always be giving back same answer again and again and again, while real person might start saying other things or even insult you. But all Chinese room can is to match the one set of symbols with other set of symbols. The simple question of "what I said five minutes ago?" would expose it's true nature even faster. Or any question that requires collection and interpretation of input data.
Ergo, I think it is impossible to invent a non-person machine that can truly pass the Turing Test. It's either true AI or it'll be busted.
youtube
2016-08-09T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugi1yh8I7Gqn_3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjXrER6VVXb0XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgijjsMpDmX7z3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Uggfndq2J7NnqngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghYgtQFZIyZQHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghH74kqk0AFK3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Uggxw68cAiW-M3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiKQ4yawP8DjngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghOhj5JFXSTzngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh42eOQdjXgzXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"}]
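The coding pipeline behind this page is not shown, but the raw response above is a JSON array of per-comment records keyed by `id`. A minimal sketch of how such a response could be parsed and indexed to support the "look up by comment ID" view (the two records here are copied from the response above; the indexing code itself is an assumption, not the app's actual implementation):

```python
import json

# Two records taken verbatim from the raw LLM response shown above;
# field names match the coding dimensions in the result table.
raw_response = """[
  {"id": "ytc_Ugi1yh8I7Gqn_3gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UghOhj5JFXSTzngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Index the coded rows by comment ID so a single comment's codes
# can be retrieved directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UghOhj5JFXSTzngCoAEC"]
print(row["reasoning"], row["emotion"])  # consequentialist approval
```

In practice the model may return malformed JSON, so a production version would wrap `json.loads` in error handling and validate that each record carries all four dimensions before indexing.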