Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Regardless of how we feel about AI and it's impact on humanity I think that it's…" (ytc_Ugyvl7IRb…)
- "They all full of CRAP....Stop targeting black folks with this sick facial recogn…" (ytc_Ugw5qO9Xw…)
- "A child death is alway a tragedy, but it doesn't came from nothing. Those traged…" (ytc_UgxZFeBS0…)
- "a good sized dog can potentially kill you if he wanted, yet they are known for b…" (ytc_UgwtAGj_B…)
- "Please DO NOT use AI for making art. It steals artists' work and is capable of a…" (ytc_UgzJP5Ep3…)
- "I have always use ai as google 😅 if I don’t find the answer I’m looking for on t…" (ytc_UgzNSkjPw…)
- "AI really does cross ethical boundaries. As someone who comes from both tech and…" (ytc_Ugytg_h25…)
- "Well, here's my take on it. Is the threat real? IMHO yep. But here's something …" (ytc_UgwaHX4mv…)
Comment

> why would we ever have a need to create a robot that can feel negative emotions such as pain or fear?

youtube · AI Moral Status · 2017-02-23T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_UggxBv6Bh68AOXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugi6g4FkM0SElXgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugh1j66C9k7XO3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugik2MV5JbWHtXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UghtsfO07MMnfHgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgiTS2v4li_yF3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugj5BBXR8r_1EXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ughby7Ihz3l8n3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgggfKyYxs8w4HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughgv7iY07dgTHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```