Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "You can blame Tesla for selling fallible technology. But the core problem is th…" (ytc_UgwC9HvAD…)
- "Hello Swati! You're absolutely right!. The answer is B, as no robot is actually …" (ytr_UgybHGTe6…)
- "And ironically, it makes some differently abled people more emotionally challen…" (ytr_UgywXjrGe…)
- "I have a stupid question for Professor Geoffrey Hinton that I think he'll never …" (ytc_UgwJTqY6b…)
- "i think suchir balaji TRIED to make openai give him $$$ for not testifying again…" (ytc_Ugzaaao3x…)
- "There's something uniquely hilarious about creating an entirely AI video to talk…" (ytc_UgxB4XCUE…)
- "As a teacher myself, this new technology is genuinely hard to make sense of. Do …" (ytc_Ugw9Aq6hb…)
- "Cheer up BBC, ai is less old boring than you. England archaic crap holes. New …" (ytc_UgxbrnwPT…)
Comment

> Well, if we develop advenced AI, let's just ask it. If it want's basic rights, let's give it basic rights.

youtube · AI Moral Status · 2017-07-21T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugj3khvLILefu3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjkvhPCQfER1XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggkkBXOQTM7nngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UghQgFsiOnvSDngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgiJo7bnF0HeVXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgidMGiSpVopw3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg7qlWvQgN3N3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiSbIvA5BJ4O3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghV2ZWqZTg1QHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggUB5a8zOw5mngCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]
```
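A raw response like the one above has to be parsed and validated before its codes can be trusted: LLMs occasionally drop fields or emit values outside the codebook. The sketch below shows one minimal way to do that in Python. The allowed value sets are inferred from the values visible in this sample; the actual codebook may define more categories, so treat `ALLOWED` as an assumption, not the real schema.

```python
import json

# Dimension values observed in the sample response above. This is an
# assumption inferred from the data shown, not the project's full codebook.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological",
                  "consequentialist", "contractualist"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"mixed", "indifference", "approval",
                "outrage", "fear", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded rows) into
    {comment_id: codes}, dropping rows with missing ids, missing
    dimensions, or values outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # the model emitted a row without an id; skip it
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# Example with one valid and one out-of-codebook row (hypothetical ids):
raw = ('[{"id":"ytc_a","responsibility":"user","reasoning":"mixed",'
       '"policy":"ban","emotion":"fear"},'
       '{"id":"ytc_b","responsibility":"robot","reasoning":"mixed",'
       '"policy":"ban","emotion":"fear"}]')
batch = parse_batch(raw)
print(sorted(batch))  # → ['ytc_a']  (the invalid row is dropped)
```

Rejecting rather than repairing bad rows keeps the downstream counts honest; flagged rows can then be re-sent to the model or coded by hand.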