Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Why was the car driving so fast in the first place? A human would've driven slow…" — ytc_Ugw37tso1…
- "I doubt ai or a robot will ever do my job. I,m a gas emergency engineer. He ass…" — ytc_UgxeHKHPg…
- "What Elon is saying about Larry Page is alarming. Unfortunately, I believe the t…" — ytc_UgzUNyLDn…
- "perchance really good and free no catches but they just need an AI video generat…" — ytc_Ugz_dfWKx…
- "Because medicine does more harm than good on some people. But yes its pretty dum…" — ytr_Ugye3FXw3…
- "ChatGPT needs to have safeguards, stop guards, something to combat suicide. Some…" — ytc_UgxKvQ5hw…
- "Its when they take credit for something the ai made for them. Straight up preten…" — ytc_UgzKg5eGh…
- "This looks like a mash-up of Craig Mullens and Marco Bucci. Art is about sharing…" — ytc_UgxWrLCaR…
Comment
I think we’re on the verge of a major political crisis taking place somewhere in the world, a deep fake of a major incident coinciding with a blackout of information, followed by a wild public backlash, then revenge attacks, then spiralling retaliatory moves until total anarchy is unfolding. Let’s hope AI doesn’t hit the reboot button on humanity.
youtube
AI Moral Status
2025-12-24T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzVq584tvWXcNG487p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz7D2FgSsnOiL1D7rF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzoIgV_Mu0eqFyw2D94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzPICKIfu49WklgtDZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy0XqEekdhGn_A4QEJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwGu7mU9jRPWfVJ9pF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxJwWBA5ZUlfb2a7G94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzuiYTtYwKRaBWYcJR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz82ZK2amhE-E9X83J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyNETnWHyEGynuFPKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
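The raw response above is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response could be parsed and validated before storage — assuming the allowed labels are exactly those observed in the responses shown here (the real codebook may define more values), and that comment IDs use the `ytc_`/`ytr_` prefixes seen above:

```python
import json

# Allowed labels per dimension, inferred from the responses shown above
# (assumption — the actual codebook may permit additional values).
CODEBOOK = {
    "responsibility": {"none", "user", "ai_itself", "company",
                       "developer", "distributed", "unclear"},
    "reasoning": {"mixed", "deontological", "consequentialist",
                  "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"resignation", "outrage", "fear",
                "indifference", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        comment_id = entry.get("id", "")
        # Skip entries whose ID doesn't match the expected prefixes.
        if not comment_id.startswith(("ytc_", "ytr_")):
            continue
        # Reject any entry with an out-of-vocabulary label.
        if all(entry.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(entry)
    return valid
```

Filtering this way means a single hallucinated label or malformed ID is dropped rather than silently written into the coded dataset.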