Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Previously if you wanted art done in an artist's style you'd have to either hire…" (ytr_UgyNbcpi2…)
- "I'm all pro-AI and Pro-ChatGPT, but boy, make sure you do your due diligence and…" (ytc_UgzYx8QkE…)
- "On a side personal note, I believe we are arriving the point where robotics and …" (ytc_UgjW0kMAO…)
- "I'm not afraid because while I'm not paranoid and influenced by Terminator, I co…" (ytc_UgywxcWMU…)
- ">\"It's tedious, horrible work, and they pay you next to nothing for it.\" I'm…" (rdc_l9vfbhy)
- "So in other words in another generation, only AI will know what code it's made o…" (ytc_UgzcM6Qlt…)
- "4 years older, break ain't comin' my friend. Brace to tolerate a life of stupidi…" (rdc_gkrwr0y)
- "As someone who works in IT service desk, I'm not worried yet. This call did 0 tr…" (ytc_Ugzba4cl0…)
Comment
> How does it's programming work are there no safety protocols, I mean in the programming, there should certainly be safety measures to prevent ai's from over-coming these safety protocols. Somehow, if this video is factual, there is something very wrong with the programming, if it' up to programming, then they should be taught something self sacrifice for the greater good at least to prevent harm to humans, this is unstoppable now yes, but it is not too late to instill programming that makes it far more moral and ethical then humans ever were or will be, there has to be some kind of way for it to learn that yes humans are messed up, but still worth keeping around.
Platform: youtube
Incident: AI Harm Incident
Posted: 2025-08-26T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxWJJCsYSeT_H-mjpR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyvNy5vs2QEQfCDXLN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxjEVFtDa0--XCWzp14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwlfRPDjOiu03n51Wh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxRmpLd_afCSQpaFUN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzsfIjeBOu52GHnc614AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwTMEDDQkeNnDZQ3t54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyysaXGl1cSCdFnrfp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyWuZfSHVG4fnTajhB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxu7neX--suzvmF_6B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
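Because the model returns one JSON record per comment, a downstream pipeline typically has to parse and validate the batch before storing codings. The sketch below is a hypothetical validator, not part of the tool shown here; the `ALLOWED` sets list only the category values observed in the sample response above, which may be a subset of the real codebook.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension has an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

sample = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}]'
print(len(validate_codings(sample)))  # 1
```

Validating against a closed vocabulary like this catches the most common failure mode of LLM coders: inventing a category label that is not in the codebook.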