Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This makes assignments very easily but more easily, tutor detects that your work…" — `ytr_Ugw6v8Z8w…`
- "AI can not interpret what it is reading like a human can. John 1:1 says "In the…" — `ytc_Ugx_b3b3w…`
- "You nailed it. AI is just another tool to help you do your job. The issue is tha…" — `ytc_UgzophaCC…`
- "We aren't getting to matrioska brain AI quite yet. (Ie AI's that use all the pow…" — `ytr_UgwDI51Ts…`
- "it isnt "art" its ai images, dont call it that, "the expression or application o…" — `ytc_Ugwd07O2A…`
- "How many of these AI databases do they need. They are building these things on b…" — `ytc_UgyhHNV3_…`
- "I think it's because it internally is built on humans design where it starts to …" — `ytc_UgztbSl3J…`
- "Ai "artists" want to be able to create how all the masters they see create. they…" — `ytc_Ugw0CoBrZ…`
Comment

> did he say all the money will go to the people who created the AI, and then they would change how wealth is distributed? Uhh how about we cut out the middle man and start gutting the 1% if that's the true way to end poverty

youtube · AI Moral Status · 2026-03-18T05:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxDH4I00pEQiTqNVwl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgysgaxRySe2664aTqt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx0mDHNZpWtCLRtK3J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwc9AETtnp2NcGjvOB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy7JY5EJu6WYxEEOBR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyCp8wX_If0rdf1fHh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxPTjaV6HbctVYoXWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxM3oVjRU4ofFEXNAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzv9m5IH9n1ls4EfPV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZEJIulBXsEi35Owt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
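The raw response above is a JSON array of coded records, one per comment ID, with one value per coding dimension. A minimal sketch of a batch validator, assuming the category sets visible in this sample (the real codebook may define additional values, and `SCHEMA` here is inferred, not authoritative):

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the actual codebook may contain more categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors (empty = valid)."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"response is not valid JSON: {exc}"]
    if not isinstance(records, list):
        return ["response must be a JSON array of records"]
    errors = []
    for i, rec in enumerate(records):
        if not isinstance(rec, dict):
            errors.append(f"record {i}: not a JSON object")
            continue
        if "id" not in rec:
            errors.append(f"record {i}: missing 'id'")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(f"record {i}: {dim}={value!r} not in codebook")
    return errors

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}]'
print(validate_batch(raw))  # → []
```

A check like this is useful between the LLM call and the database write: records that fail validation can be re-queued for recoding instead of silently landing in the coded results.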