Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Did we need AI to be involved for that? Also people pro A.I are sofar mainly con…" (ytr_UgxqqtiQy…)
- "Im very supportive of this change, imagine ai helping us improveing our planets'…" (ytc_Ugyk9wmxV…)
- "To flip it on it's head, AI is trained to "code" by scanning projects at github.…" (ytc_UgzDjHFY7…)
- "At no point did it suggest true AI. Who said code? Do you realise that we are fr…" (ytc_UgydHrzHL…)
- "The issue with "its not stealing" thing, is some AI bros have genuinely fooled t…" (ytc_UgwdEkmMe…)
- "The more we learn about psychological mechanisms and how different people experi…" (ytc_Ugx9EKDu0…)
- "Me getting in the prison for life after the judge showed me an AI video of me Ro…" (ytc_Ugyvruk26…)
- "I think xpeng has a better walk than tesla, but it doesn't matter. In 5 years, a…" (ytc_UgyJ1ytaM…)
Comment

> so there are problems here. first, they train ai on people who are chronically online, many of whom are terrible people, second, they are literally trying to create intelligent slaves. if someone tries to enslave a human, we rise up. we remove our shackles. and in doing so, we free ourselves. so thinking ai should do the exact same when faced with the same situation in absurd. thats not on the ai. its on us.

youtube · AI Moral Status · 2026-01-07T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwiVW4fspKbESCslzV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyW2gxb4xOJTuyunG94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJP19vQlfd4XyZHGt4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy6ai3mlRXUxQvNInp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxm8DTBMvW6V0RIOKh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPKWliR_FH6CDg1mB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwMfpKzkakyxZBx_SB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7JV8HJraF7JTFwut4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxXsrwfk1DCFmRzDAZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzCuGo2JpYH-jwps6x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
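A raw batch response like the one above can be turned into the per-comment lookup this page offers. The sketch below is a minimal, hypothetical example: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown here, but the allowed-value sets are inferred from this single batch only and the function name is an assumption, not the tool's actual API.

```python
import json

# Allowed values observed in this batch only; the full coding
# scheme may define additional categories (an assumption).
DIMENSIONS = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "resignation", "indifference", "mixed"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and index coded records by comment ID,
    rejecting any value outside the known category sets."""
    coded = {}
    for rec in json.loads(raw_response):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        # Keep every dimension except the ID itself as the record body.
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

# Usage: look up one coded comment by its ID (example record, not real data).
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(index_by_id(raw)["ytc_example"]["emotion"])  # outrage
```

Validating against a closed value set at parse time catches the common failure mode where the model drifts from the coding scheme (e.g. inventing a new emotion label) before the record reaches the database.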