Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "often people misunderstand the actual need for education. Its not about writing …" (`ytc_UgzRnGz-M…`)
- "Humans are evil, and who created ai? Humans, We are horrible, some of us kill an…" (`ytc_UgxstcQUe…`)
- "No, AI isn’t better than actual artists / Sure it make look “good”, but it has NO…" (`ytr_Ugzlammhj…`)
- "I enjoy the optimism, but as someone who has been quite successful in this indus…" (`ytc_Ugx6qF--Y…`)
- "Imagine AI in a World of Digital ID & Digital Currency, you will have complete p…" (`ytc_UgzJX8wib…`)
- "My IT career has turned into AI and ML work. I love AI and Machine learning, its…" (`ytc_UgwDjH3A7…`)
- "THE CHICAGO CASE IS NOT AN AI its just sorta relies on friendships Which can b…" (`ytc_UgzQKc96e…`)
- "@elgorrion52 that's a statement about AI's learned attitudes to respond to ques…" (`ytr_UgxHFFQON…`)
Comment
Re: "do we need to make AI able to suffer in order to to align them?"
Humans are able to suffer and we are WILDLY misaligned with all other species that are able to suffer (in other words, all other sentient species) on earth except maybe dogs, cats, and a few other sympathetic ones, and even then we treat them pretty awful when we do animal testing on them. See also: factory farms (where 99% of animal products come from), industrial fishing, animal farming in general, hunting, fishing, and the way we develop land with no regard to the animals living there
Source: youtube · AI Moral Status · 2025-10-31T15:2… · ♥ 71
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
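The table above is one record drawn from a four-dimension coding scheme (responsibility, reasoning, policy, emotion). A minimal sketch of validating such a record — the class name is illustrative, and the allowed value sets below are only those observed in this batch's output; the full codebook may include more:

```python
from dataclasses import dataclass

# Allowed values as observed in this batch; the real codebook may be larger.
RESPONSIBILITY = {"none", "company", "ai_itself", "distributed"}
REASONING = {"unclear", "consequentialist", "deontological", "virtue", "mixed"}
POLICY = {"none", "regulate", "liability"}
EMOTION = {"indifference", "outrage", "fear", "resignation", "mixed"}


@dataclass
class CodingResult:
    """One coded comment along the four dimensions shown in the table."""
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        # True only if every dimension holds a known value.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)


# The coding shown in the table above.
result = CodingResult("distributed", "consequentialist", "liability", "outrage")
print(result.validate())  # True
```

A check like this is useful as a guard when ingesting raw model output, since the LLM is not forced to stay inside the codebook.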
Raw LLM Response
```json
[
  {"id": "ytc_Ugz7UMuzISfDB2d4XLR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzPzpx6ketGiJ9xIex4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy5cyJPKe9K1Lh5gD54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyDqVJmqxoVbRWMdMZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxAh6axa2AuhhQoOKV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx1xOQ1h23lJSEgmIp4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw-oDUYiVr8AWRLXu94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwAlDe4Sg1BZrH2x314AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwSAEAsiRw5R_WOT-x4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyUhxYaeFCcZgD5sZl4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "mixed"}
]
```
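Looking a coded comment up by its ID, as the page above does, amounts to parsing this JSON array and indexing it. A minimal sketch — the variable names are illustrative, and the embedded response is truncated to two of the ten records for brevity:

```python
import json

# Raw LLM response as exported above (truncated to two records for brevity).
raw_response = """
[
  {"id": "ytc_Ugz7UMuzISfDB2d4XLR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwSAEAsiRw5R_WOT-x4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# Index the coded records by comment ID for constant-time lookup.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

record = codes_by_id["ytc_UgwSAEAsiRw5R_WOT-x4AaABAg"]
print(record["emotion"])  # outrage
```

With the full ten-record response, the same dictionary supports the "Look up by comment ID" box directly.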