Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- rdc_jt16dj0: And you don't have the slightest hunch this might be a bad decision? What does a…
- ytc_UgwtxgYb2…: “Someone should think about it more” is what the CEO of ai said about how humans…
- ytr_UgzjaVIrl…: When an artist is learning, in whatever area they are, they usually study artist…
- ytc_UgyqSIm0w…: Besides the super interesting topic of this conversation, I want to applaud Mr. …
- ytc_UgwnjOjPL…: I think it’s actually really sad that people just use AI to make art instead of …
- ytc_UgyqV6A3h…: Half of people aren't that bright. Not that bright people are easily confused an…
- ytc_Ugzfwp--8…: Ah yes, getting an Ad to use AI on a video about fuck off to AI.…
- ytc_UgzwUQJfT…: I keep hearing all this and maybe it's so, but I've yet to see AI solve one of o…
Comment
Isaac Asimov's "Three Laws of Robotics"

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube | Cross-Cultural | 2025-11-02T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzbC8Gm-aMTo-9CpP54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzi4cnrY_Feqp6bilt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTBwI-TrW-bDrWSBB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw8Ew6k5P708wA8XSp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwdLY663s2j2Mopgeh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy1YOJmPOEPTQqXjDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzSeKL9wdAkbydwnkp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx-TCTyJjYXU4ZQGmh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwph404pC8WFUAn3e94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwpouUbKjO1j1i4wyR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"}]
```
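The lookup-by-comment-ID view above amounts to parsing the raw response and keying each record by its `id`. A minimal sketch in Python, assuming the model output is a JSON array with the fields shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_by_id` helper name is illustrative, not part of the tool:

```python
import json

# A two-record excerpt of the raw LLM response shown above.
raw = '''[{"id":"ytc_UgzbC8Gm-aMTo-9CpP54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzi4cnrY_Feqp6bilt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}]'''

def index_by_id(response_text):
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
print(codes["ytc_Ugzi4cnrY_Feqp6bilt4AaABAg"]["reasoning"])  # deontological
```

The same dictionary backs both views: a random sample iterates over `codes.values()`, while the ID search is a single dictionary lookup.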