Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_Ugy841Yzn…`: "Not surprising that the ex-Google CEO is gaslighting and out of touch. Saying th…"
- `rdc_kgp96fa`: "See if you can find a book paper or article written by your teacher or dean, tha…"
- `ytc_Ugy61oVPw…`: "I have personally used AI just to practice on my coloring and lighting - althoug…"
- `ytc_Ugwd6GJ1r…`: "As soon as i finished the video i knew i would find some random mf using this em…"
- `ytc_UgwKa9rP2…`: "When AI makes a mistake and hurts people, who is responsible? I don’t want a soc…"
- `ytr_UgzmWgUc8…`: "Imagine needing AI to answer this when you could just google it or go to Apple’s…"
- `ytc_Ugwp8_D4I…`: "The irony on the first point alone that AI enthusiasts don't understand the very…"
- `ytc_UgytcYUlW…`: "First they said 2025….then 2026…then 2027….then 2030….AI is not even able to lif…"
Comment
Stop training AI. You are literally giving away your future, your children’s future, humanity’s future and walking into their digital slave prison. AI requires ALOT of energy to run continuously. The mega rich who own this planet and are/have taking all the world’s resources for themselves (and/or set regulations so that ordinary folk are no longer allowed to live off grid and self sustain because of arbitrary rules and laws they put in place). When the AI needs more energy to keep running, these mega rich will just start human farming us as biological batteries to pay our many bills (whoever wrote The Matrix script was prophetic 😞).
youtube · AI Moral Status · 2025-07-30T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxf2o7JE3o01Ah0ukF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwL5MzP5WHnLywh8Dx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxU4n3c_mgM4pSBhUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx0ycBoOshIRc2yE2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz8-apCyyE5RfM9H9V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTdtLfwMGKKoWzQm94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwE8jlLyXNlWcRdhZd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyH700Bm93uF45iKJ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwYpXpUh4l1vdoFEMR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx1BPQg9VlXwugjPk14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
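A raw response like the one above can be turned into the per-comment "Coding Result" view with a small lookup. This is a minimal sketch, not the tool's actual code: `raw_response` holds an abbreviated copy of the model output, and `coding_for` is a hypothetical helper name.

```python
import json
from typing import Optional

# Abbreviated copy of the raw model output shown above: a JSON array
# of objects, one per comment, keyed by comment ID.
raw_response = """[
  {"id":"ytc_UgwL5MzP5WHnLywh8Dx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyH700Bm93uF45iKJ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

def coding_for(comment_id: str, raw: str) -> Optional[dict]:
    """Return the coded dimensions for one comment ID, or None if absent."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            # Drop the ID itself; keep only the coding dimensions.
            return {k: v for k, v in row.items() if k != "id"}
    return None

print(coding_for("ytc_UgwL5MzP5WHnLywh8Dx4AaABAg", raw_response))
# → {'responsibility': 'company', 'reasoning': 'consequentialist', 'policy': 'ban', 'emotion': 'fear'}
```

Looking up `ytc_UgwL5MzP5WHnLywh8Dx4AaABAg` this way yields exactly the dimension/value pairs rendered in the table above.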