Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Using AI to replace human workers is a self-destructive prophecy. Where will the…
ytc_UgxweDOXE…
I like that one of her first concerns is why we should ask about AI rights. This…
ytc_Ugw71xdB1…
TBh - Every tech leader wanted him to be the president. The tech leaders themsel…
ytr_Ugyxjg9VC…
The Original Sin of Simulation could name not just a cultural condition but a co…
ytc_UgxEXf042…
lol reddit is insane. That guy is like "Dude, just install the AI on your phone…
rdc_mwww4a6
I do understand your point and while i dont like ai art either i have to give th…
ytc_UgyUf6DCs…
I don't know but I'm seeing reports that AI integrations are failing massively -…
ytc_UgwDz0pMW…
10:59 so does this mean that the AI can get emotionally sick just like humans…
ytc_UgyvXvpbj…
Comment
| Field | Value |
|---|---|
| Text | I Think it's okay to give a robot conciousness and feelings of pain. But not the wrong ego. |
| Platform | youtube |
| Source | AI Moral Status |
| Posted | 2017-02-23T16:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgjV9UePu6YyCHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgieTFR18XcIA3gCoAEC","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj30UnXi1Q4_3gCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UggZWxHLTN7AdXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugi9tUS0262uUngCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_UgjMs4-9SuUw4XgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiYKjtUokqNvHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghJ-9uaWvRB83gCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjjH_Jmx6AGkHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugj6vBfMjdV3c3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
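The raw response above is a JSON array in which each row carries a comment ID plus one value per coding dimension. A minimal sketch of how such a response could be parsed and filtered against the dimension values seen in this sample follows; the `SCHEMA` sets and the `parse_raw_response` helper are assumptions inferred from this one output, not the tool's actual validation logic.

```python
import json

# Allowed values per coding dimension, inferred only from the sample
# response above -- treat these sets as assumptions, not the full codebook.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only rows that match the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip malformed rows rather than failing the batch
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
print(parse_raw_response(raw))
```

Dropping malformed rows instead of raising keeps one bad row from invalidating the rest of a ten-comment batch, which matches how the dashboard shows per-comment results.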