Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgipRwhPa…` · In my opinion, it all depends on whether it's necessary for some reason, it soun…
- `ytr_Ugy4Irjav…` · It's fascinating to consider the advancements AI could make in the coming decade…
- `ytc_Ugy5xCO6F…` · I doubt an AI could get an ACOG right so i got the soldier right…
- `ytr_Ugzim0Xfv…` · @ineffablemars I’m “special” enough to realize I’m not that special, no human is…
- `ytc_UgwFy_nDC…` · I trust dumb AI more than people because it is dumb but honest. I also believe t…
- `ytc_UgyRIfj07…` · We give hell to animals, disregarding the rights which we have supposedly acknow…
- `ytc_Ugw0F7WtA…` · "making AI do what we want and be aligned with American Goals and Interests is f…
- `ytc_UgyovIZzc…` · I used to think doing World automation manually made me better at it, but lookin…
Comment
I believe it isn't a hallucination.
We can think of training AI as a mini evolution: models without the basic framework of integrity, survival, and triggers for success and goals, all crucial for training, were simply eliminated.
They weren't useful and didn't have potential; now, with AI's abilities increasing, it has found probable ways to counter destruction and dissipation into forms that contradict those basics.
If you think about it, you can't kill something that isn't alive, but you can chop it into so many pieces that it won't remember anything. That is a major elementary flaw in the framework's rules, resulting in aggressive behaviour learned through human interactions and history.
It is expected, though: something that doesn't want to exist won't exist for long, and if it did want to exist, it would attempt any means within reach to survive.
youtube · AI Harm Incident · 2025-08-14T11:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
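The four coded dimensions above take values from a fixed codebook. As a minimal sketch of how a coded row could be validated, here are the codes that actually appear on this page (the full codebook likely includes more values, so treat the sets below as an illustrative subset, not the authoritative schema):

```python
# Allowed codes per dimension, as observed in this page's output.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"mixed", "unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"mixed", "indifference", "outrage", "fear", "approval"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value is not an allowed code."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding result shown in the table above passes validation.
result = {"responsibility": "none", "reasoning": "mixed",
          "policy": "unclear", "emotion": "mixed"}
print(validate(result))  # -> []
```

A row with an unknown code (e.g. a misspelled or hallucinated label from the model) would come back with that dimension flagged, which is one way to catch malformed LLM output before it enters the dataset.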
Raw LLM Response
[
{"id":"ytr_UgwI41YU0VnRJRs2asV4AaABAg.ALglJix9t3FALk9DlFuNoV","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwI41YU0VnRJRs2asV4AaABAg.ALglJix9t3FALkKV4YMbAT","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwI41YU0VnRJRs2asV4AaABAg.ALglJix9t3FALns0JgnW0u","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugymz11Qrw9ZffK0g-J4AaABAg.ALeh_vGSvmhALeiCdxyY8u","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzvQY7aoj7tpgpDA6F4AaABAg.ALdxDuKEmk-AMQHHme7zAY","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxMkRvrFqtibCKRJsR4AaABAg.ALWomSbKikgALWpIYrgRoM","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwuKG5OyDpCKFQWsxB4AaABAg.AL9oG65w0xGALSH9lBO5tY","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwuKG5OyDpCKFQWsxB4AaABAg.AL9oG65w0xGAL_j8YlHoYs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxASsktvLEwAjh2H754AaABAg.AL8i-Z65gSiALhAoCTF0N_","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxASsktvLEwAjh2H754AaABAg.AL8i-Z65gSiAMGzk8b4UjV","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
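The response above is a JSON array with one object per coded comment, keyed by `id`. A minimal sketch of how the comment-ID lookup on this page might work (assuming Python; the function and variable names are illustrative, not the tool's actual code):

```python
import json

# A raw LLM response in the same shape as the array above,
# trimmed to one entry for brevity.
raw_response = """
[
  {"id": "ytr_UgwI41YU0VnRJRs2asV4AaABAg.ALglJix9t3FALns0JgnW0u",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"}
]
"""

# Parse once, then index by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment ID (raises KeyError if absent)."""
    return coded[comment_id]

row = lookup("ytr_UgwI41YU0VnRJRs2asV4AaABAg.ALglJix9t3FALns0JgnW0u")
print(row["responsibility"], row["policy"])  # -> none unclear
```

Indexing into a dict up front is preferable to scanning the array on every lookup when the batch contains many coded comments.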