Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Come on, Elon, we already know what happens. Just watch any of the terminators …" (ytc_UgyWhmA6B…)
- "The ONLY time ive used ai was to make funny images to laugh at the weird forms w…" (ytc_UgzvFOzwP…)
- "Still no robot carpet layers, robot plasterers, robot plumbers, robot roofers, r…" (ytc_UgxxrIGR9…)
- "They get things right by accident and then say the robots wrong? now you wanna s…" (ytc_UgxiEdL1Q…)
- "And we will live in a future where we look back on this video and think it’s old…" (ytc_UgyzcmU9c…)
- "The fact that it's come down to this, not being able to determine what's real or…" (ytc_Ugzg_LpPA…)
- "Want to see it stop? Attach a criminal law that would apply to Waymo executives…" (ytc_UgyHjk4TZ…)
- "I'm not against ai being used to assist in the creative process, it's when you a…" (ytc_UgywgUveD…)
Comment
I would suggest that no matter how ginormous and comprehensive a code-book you're given, you would never pass the "Chinese Turing Test" unless you actually speak Chinese. Not for very long, anyway. At some point or another you will be presented with some nuanced phrasing or some complex sentence that any Chinese speaker would immediately understand yet your code-book doesn't cover.
And that's just the point - No collection of "If - Then" code, no matter how comprehensive it is, would ever fool a human into thinking it's conscious for long. In order to do that you would have to create the system in a way that actually mimics human thought processes and learning - things whose workings we don't yet fully understand.
Once we know how to accurately describe those in humans, we could create algorithms based on them and therefore code programs that mimic them.
youtube · 2016-08-11T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
 {"id":"ytc_UgihTDu9TVqWJ3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugh3jypUXJ3aq3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugje_oczNoCl4ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UggJDVBrEZQ8JHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UghjTO1j58RZB3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugh7UhkBGL3Kc3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UghTpjY2tZvJXngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgjdJIL8FexO9HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgjXLgiUCTbywngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugg5BaG1i2T1FngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
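The raw response above is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions from the table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step is below; the set of allowed values is inferred only from the examples on this page, not from a published codebook, so treat `ALLOWED` as an assumption.

```python
import json

# Allowed values per dimension, inferred from the samples shown on this
# page -- an assumption, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "fear"},
}

# Two rows copied verbatim from the raw LLM response above.
raw = '''[
 {"id":"ytc_UgihTDu9TVqWJ3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugg5BaG1i2T1FngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"}
]'''

# Index the rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one coded comment by its full ID.
row = codes["ytc_Ugg5BaG1i2T1FngCoAEC"]
print(row["reasoning"], row["emotion"])  # deontological fear

# Validate every row against the assumed value sets.
for cid, r in codes.items():
    for dim, allowed in ALLOWED.items():
        assert r[dim] in allowed, f"{cid}: unexpected {dim}={r[dim]!r}"
```

Note that lookup requires the full comment ID; the truncated IDs shown in the sample list (e.g. `ytc_UgyHjk4TZ…`) are display-shortened and will not match as keys.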