Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:
- `ytr_UgzG1aU4f…` — "It was cool because it kinda felt like looking at how moss grows, it was so hazy…"
- `ytc_UgxLot7_N…` — "There's alot of talk about the ideology of privacy and freedom, but what exactly…"
- `ytc_UgyGnGZKX…` — "Autonomous cars can be a valuable add-on to public transit and traffic, bringing…"
- `ytr_UgxSXdgLx…` — "Yeah, saw the original fight. This was everywhere, and no, the robot here is fak…"
- `ytc_UgyYvDvOP…` — "It blows anyone minds that how it's legal to call it 'FULL' Self Driving when it…"
- `ytc_UgxrvtjLB…` — "I'm with ChatGPT on this one. ChatGPT tries to be practical, while Alex is talki…"
- `ytc_UgwakxChf…` — "Thank you Bernie for looking into this idea of AI that has been just rattling ar…"
- `ytc_UgxJSxWIk…` — "Love is a chemical reaction that we evolved for the survival of our species. Wit…"
Comment
and once again, man invents God. Why would there be rights for a machine when we don't even grant that to all the extant species that we abuse at will and in many cases eat? Why wouldn't AI treat us the same way? We kill and destroy because we can will little to no consequence. Why wouldn't our creation do the same to us?
youtube · AI Moral Status · 2025-04-28T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwcBnJHuEUfXla0WS14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwh6VTUVELEgCgYZ594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_ugcPUS1rJSfSkX94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzBgAwfnpzM4-GEVnd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzDhSXbaVFd8-74NMV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzeW9cN4BKgeJqSMwt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzr1H1qt2ydyg--8IN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz9uP3ailRvKrZuIHN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy9lumlptX_Pl8IFA54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxJ0y0-RfxromYI0tB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
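A raw batch response like the one above can be looked up by comment ID programmatically. Below is a minimal Python sketch, assuming the response is available as a JSON string; the allowed values per dimension are inferred from this one sample (they are not a documented code book), and `index_codes` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# This is an assumption, not a documented schema.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "resignation",
                "approval", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    and return {comment_id: coding}, skipping any record whose values
    fall outside the inferred code book."""
    out = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return out
```

With the response text in hand, `index_codes(raw)["ytc_…"]` would then return the four coded dimensions for a given comment, mirroring the "Look up by comment ID" view.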