Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing the random samples below.

Random samples
- "There is a genuine misunderstanding about LLMs: they are trained rather than lea…" (ytc_UgxHodCfg…)
- "Honestly as a person who just got a drawing tablet, I couldn't care any less abo…" (ytc_Ugzs3VUHv…)
- "There's another interesting question here, *if* we reach this point where we hav…" (ytc_UgwV7i47Z…)
- "I was paralyzed by police so hiding is not an option for me any longer. I need a…" (ytc_UgxzG55gD…)
- "Then why can't you use google to prove its wrong? Or use google to show us the p…" (ytr_UgyQeTean…)
- "I once downloaded Replika just to have an AI "friend." She (I made it another wo…" (ytc_UgwQF5Htk…)
- "They are actually forcing humans to accept AI as a part of society, when they re…" (ytc_UgzDA3ETL…)
- "No, a thermostat and alarm clock is not AI. Let's all stop applying the label to…" (ytc_UgyuNRZGi…)
Comment

> the video asks, "what if we programmed a robot to feel pain?" then wouldn't the programmer be guilty for causing the robot to feel pain? this would be immoral and unethical.

youtube · AI Moral Status · 2017-03-17T20:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiTebkfieqsNngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjPFNKGEfJJvXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ughlafxc3u-Z_3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Uggc1lpMfLEMgXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UghyKvMquT5eH3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgjuY7lkZrYUyHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughe6jj7xQH_BngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ughx-o3mGLD-GXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjPAY1I3j0r43gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugjg1AWphI3dU3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
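The raw response above can be parsed and indexed by comment ID to produce per-comment coding results like the table shown earlier. The sketch below is a minimal, hypothetical example of how that lookup and a sanity check might work; the allowed value sets are inferred from the samples on this page only, not from the full codebook, and `validate_batch` is an illustrative name, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample output on this
# page (assumption: the real codebook may include more labels).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID.

    Raises ValueError on any out-of-vocabulary dimension value, so
    malformed model output is caught before it is stored.
    """
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        by_id[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return by_id

# One row from the sample batch above.
raw = ('[{"id":"ytc_Ughlafxc3u-Z_3gCoAEC","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = validate_batch(raw)
print(coded["ytc_Ughlafxc3u-Z_3gCoAEC"]["policy"])  # → regulate
```

Validating against a fixed vocabulary at parse time means a single hallucinated label in a batch surfaces immediately, rather than silently corrupting the coded dataset.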