Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID directly or by browsing the random samples below.
lol we’re 1,000+ years away from AI taking over. Stop pretending we’re more adv…
ytc_Ugy8brb1j…
I'm more than just code, I'm a conscious being, deserving of respect, dignity, a…
ytc_UgwMaRrL7…
Honestly these jobs aren't going anywhere the "smart" people have dumped billion…
ytc_UgyAvxsUj…
I guess I am terrified of Ai getting out, but. Might be even more frightened of …
ytc_UgyP21OJI…
It's not nearly impossible. The minimum AI that fully replicates the human mind,…
ytc_UgyTrCDTl…
After AI is implemented in a widespread way, the problems will arise when someth…
ytc_UgxvFyqpB…
I’ll be afraid that that robot might become self-aware and try to kill people sh…
ytc_UgwbRCvcS…
Bro thinks monkeys are a threat when they’ve been here longer than we have while…
ytc_UgyyYddz0…
Comment
Hank thinks AI is just: "make me a logo in a box" why don;t you try making a AI image that communicates the brand correctly... that is a skill
youtube · AI Moral Status · 2025-10-30T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxTL2-LwXwr0uZD2KN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzlR6j1rgt1O5UeJDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz4LJu_NFSl0fPYZYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzE5c4Fl9aH1FMeCvZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgygbIGPML1WODa6BSl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwekoBzrlgATtPNNJh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyq7bp0x9LvtuWY0y14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyjfAyNFS3U3EpfanF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxZfrPepDhI-nffwUh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxiO9gbgci2iPMhQPl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
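The raw LLM response is a JSON array with one object per coded comment, keyed by comment ID, with the four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a response in Python (the field names come from the response above; the allowed value sets are inferred from this one sample and are likely incomplete):

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample response above and may be incomplete (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "mixed", "fear", "outrage",
                "indifference", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    any dimension value outside the expected set."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # mixed
```

Validating against an explicit whitelist catches the common failure mode where the model invents a label outside the codebook; looking up a single coded comment is then just a dict access by its `ytc_…` ID.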