Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzsW2ocD…: anyone can make art if they really put their heart into it. trust me I KNOW,, i’…
- ytc_Ugxbu2qY5…: You're just giving AI psychopaths a new thing to train AI on! Seriously AI bros …
- ytc_UgymLi4iJ…: Who is going to buy? ... Solution easy ... AI needs lots of powerplants and infr…
- ytc_UgwYH989b…: If the main developer of AI is trying to promote universal basic income, then we…
- ytc_UgyKlkU8m…: What u need to learn is that even though u draw a stick man in a garden with a …
- ytc_Ugw9WTL60…: "jeez i could see myself falling for her" Tech removes her face plate "maybe no…
- ytc_UgyWbM0a_…: Itit's a large language model. It's just really good at guessing the next word o…
- ytc_Ugz8k9d5h…: The people mad about ai making there shitty art or remote bs job obsolete are th…
Comment
From your point of view, it seems that what is important is what you consider to be a human life, whereas a lot of us care more about the life of a PERSON which is wider than just a human being. In short, if AI reached sentience and consciousness, even if it’s not human, it would be a person whose life would require protection.
Also, it is very interesting how you dismiss the DIFFERENT CULTURAL AND MORAL PERSPECTIVES which lets me know that you want to IMPOSE your personal beliefs on everyone else.
Platform: youtube
Posted: 2025-08-14T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxXI2kdenCw-6rTEFZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwEcOnUG-ca_BfOqy94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"none"},
{"id":"ytc_UgwrSvq189fru2tWO5R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy8Wwvb7IVhP92fgtJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyC93F-I5HCrV5hNAl4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"none"},
{"id":"ytc_UgzrAcc1GpsMjRgMKTV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwNVTgJp-EjovEq_MZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugznud4oYp-ZD_BJrxd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxmePKcc4FyiZFZBnZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxKTvXYhQnA7w5LOfp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
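For reference, a batch response like the one above can be parsed and sanity-checked before the codings are stored. This is a minimal sketch: the allowed value sets below are inferred only from the codings visible on this page, so the real codebook may include values not listed here.

```python
import json

# Allowed values per dimension, inferred from the codings shown on this
# page (assumption: the full codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "regulate"},
    "emotion": {"none", "unclear", "approval", "outrage", "fear", "indifference"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate each coding record."""
    records = json.loads(raw)
    for rec in records:
        # Every record should carry a YouTube comment ID.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        # Every dimension must hold one of the known values.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Hypothetical single-record batch, shaped like the response above.
raw = ('[{"id":"ytc_abc","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"approval"}]')
records = parse_batch(raw)
print(len(records))  # -> 1
```

A check like this catches the common failure modes of batch coding (a malformed array, a missing dimension, or an out-of-vocabulary label) before any record reaches the database.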