Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples:
Aloha and Mahalo Jesus Christ such a fake robot no life sorry for you no life I …
ytc_UgxzCOz2p…
Goes hand in hand with the buildout of automated surveillance; they're going to …
ytc_UgxcnvNfP…
I sense the guy in the middle really want to focus and ask if Ai can speed up th…
ytc_UgwvP5rLL…
Hey disabled artist here, I have tried many ways to conceptualize how A.I could …
ytc_UgwL8BpeA…
But with Ai we will no longer need politics. Ai is about erasing "human error" a…
ytc_Ugzk6H_W6…
Ai bros are big mad at the idea of artists respecting the effort put into making…
ytc_UgzecppWB…
I got on talkie when im bored ai is fun i like to make them cry XD😂…
ytc_Ugznziy5M…
I have been bringing this topi to everyone´s attention for months! what are you…
ytc_UgzlLvs21…
Comment
Honestly imo the biggest risk of Ai doing harm isn’t a smart one, but a dumb one. Right now dumb is all we have. They don’t “think” they search and regurgitate. It does whatever it does without actually “knowing” whether it should or shouldn’t. A truly intelligent/sentient ai would be more likely to look at us the same way we look at animals. (No the irony isn’t lost on me) Predictable life with obvious fail states.
But right now all we have is generative. It makes mistakes an can’t even tell. Ai weapons is the real issue. A silent weapon you can’t trace or take responsibility for.
But meh not like I truly know. I always treat it as if could actually think or feel to begin with. But I think greedy people using “dumb” ai will definitely ruin us an destabilize any idea of safety we like to think we have.
But yea. Idk. I guess no one does but I definitely don’t.
youtube · AI Moral Status · 2025-12-16T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugyzk5RcKcF4Y69ZxCx4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzp4PvqydJKmblSibB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxdr8bzDSb90inH4Q14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzmuQsUxxV7p1q8KkZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwtvR_eyp_RO9YB6wt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzjqqlMHr0n_R7DiQF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyvr3tc8fieR-JeJfx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxxfYNrM-SoboiK0fB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyeRAn3_UwkOD9YuFd4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzCkU7Ij7_XzVefvNt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
```
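A raw response like the one above can be parsed into a per-comment lookup with a few lines of Python. This is a minimal sketch, not the project's actual pipeline: the allowed values in `DIMENSIONS` are inferred only from the codes visible on this page, and the full code book may define others.

```python
import json

# Allowed values per coding dimension, inferred from the codes shown on
# this page (assumption: the real code book may include more values).
DIMENSIONS = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_codes(raw):
    """Parse a raw LLM coding response (a JSON array of records) into
    {comment_id: {dimension: value}}, silently dropping any record that
    is missing an id or carries an unrecognised value."""
    out = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        codes = {d: rec.get(d) for d in DIMENSIONS}
        if all(codes[d] in DIMENSIONS[d] for d in DIMENSIONS):
            out[cid] = codes
    return out

# Example with one record copied from the raw response above:
raw = ('[{"id":"ytc_Ugyzk5RcKcF4Y69ZxCx4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"unclear","emotion":"indifference"}]')
codes = parse_codes(raw)
```

Dropping malformed records rather than raising keeps a long batch run alive when the model occasionally emits an off-schema value; a stricter variant could log or re-queue those IDs instead.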