Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I dint know about the robot delivery thing. They kind of suck. Those robots are …" (ytc_UgzO6S07c…)
- "How is it destroyed if people think that the AI pic is good enough to make a new…" (ytc_UgwlIUvHX…)
- "The population collapse comes along right when Artificial Intelligence collapses…" (ytc_Ugy8gyXgd…)
- "GPT chatbots are unironically fantastic study buddies, as long as you feed it th…" (ytr_UgyL5_8c7…)
- "AI is looking more like a scam than anything else, and I’ll continue to think so…" (rdc_ohwyz7h)
- "Really if AI replaces every job. Every factory and service provider has robot. A…" (ytc_Ugw14ik_A…)
- "@wvd07the artists could just sing ai written songs tho or start using an ai voic…" (ytr_Ugw5_97f-…)
- "Your brain does not “collect” information, it interprets it. There’s no evidenc…" (rdc_mnq5pyy)
Comment
That's not what the video is saying; it's not "AI wants to…", it's following its orders. It's a tool, like you said, doing what it was made to do, BUT it can do it on its own, even making decisions to fulfill its function. It's preventable, just like safety protocols exist on construction sites: when those tools are made, they put human safety above everything, no matter what. What the video says, though, and what is being mirrored in the comments, is that those who are making the tools don't care about human safety.
youtube · AI Harm Incident · 2025-07-28T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgxASsktvLEwAjh2H754AaABAg.AL8i-Z65gSiAMIUWQs8ZSW","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxiHjSONWgF9MytblR4AaABAg.AL7xTGhZJnqALL16xkbUWK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxLg69yQ3o68BlWYA14AaABAg.AL7dIEe8EwXALBBJEhYEX4","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugws29neEky5h8y5r4F4AaABAg.AL7KdI6HwzvAL85ez1YTKK","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugws29neEky5h8y5r4F4AaABAg.AL7KdI6HwzvAL9sXWfXEiU","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzbzBmLQYKOxNNbITN4AaABAg.AL6KoiLPeQnAL7PFZbZFpX","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgzbzBmLQYKOxNNbITN4AaABAg.AL6KoiLPeQnAMMxCDpBAt8","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgzRZaN5Z1zjWMAqD_h4AaABAg.AL69YcijqY8AL6A6MHlpQw","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyFE3-0NjInX_I13Th4AaABAg.AL63OYcBbizAL8xhfiSABI","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyFE3-0NjInX_I13Th4AaABAg.AL63OYcBbizALEfyfCVp78","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
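A raw response like the one above can be checked programmatically before the codings are stored. The sketch below is a minimal, hypothetical validator: it parses the model's JSON array and flags any record whose value falls outside the coding dimensions shown in the result table. The allowed-value sets are inferred only from the sample output here, not from the real codebook, so treat them as placeholders.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# output above (an assumption, not the actual codebook).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "resignation", "indifference", "mixed", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse the model's JSON array, keeping only in-schema records."""
    valid = []
    for rec in json.loads(raw):
        # Collect any dimension whose value is missing or out of schema.
        bad = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad:
            print(f"{rec.get('id', '?')}: unexpected value for {bad}")
        else:
            valid.append(rec)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(len(parse_codings(raw)))  # 1
```

Keeping the schema check next to the parser means a model that drifts (e.g. inventing a new emotion label) is caught at ingestion rather than at analysis time.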