Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples
- There's no point in giving a robot freedom or rights if they can't appreciate or… (`ytc_UghJw5ulo…`)
- Are people prepared for that?! NO! Are govs prepared for that?! NO! Because of n… (`ytc_UgxZ9mXO4…`)
- "What create an art genuine is the history and reason." This what I figure out t… (`ytc_Ugw2g3A81…`)
- Only God can create life and this is Satans way to come close to creating life. … (`ytc_UgyEJTGId…`)
- Ai is trained on artists, so it only looks good because the source material look… (`ytc_UgzTLFFef…`)
- Not in 5 years. Maybe 20-30 years. This is the timeframe to really implement end… (`ytc_UgyaYAb7k…`)
- If things really do go to crap for society, maybe there will be riots and sabota… (`ytc_UgzteAU2f…`)
- this is a dumb take because AI will generate enough wealth for everyone to live … (`ytr_Ugz2V7B2n…`)
Comment
These AI "go for it", Silicon Valley egomaniacs are philosophically not so different from the real True Believers that comprised the Nazi Party in Germany. Building the "perfect Society", weed the weak people out, have a small cadre of people in control. People find it easy to fear the hydrogen bomb but can't get past the friendly face of the avatar, the promises of curing disease, or even immortality. I find it impossible to believe that mankind survives this threat. Just look at Mark Zuckerberg declaring that he will allow his source code to be freely available! Do you trust Mark Z?
youtube · AI Harm Incident · 2025-07-27T00:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwrnJ6m11bip-14br14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwHN5t8C_EteVstzRd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxrQ8YBvkOvM9y5b2R4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQfwE-qdyu84G1ZKt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyEVYBx9xwxpQPQ2_F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
  {"id":"ytc_UgyIVkwQxCU6Np7Mwkp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwHPQJBdw8siSdloXF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy1ZNQw2rhUMkVe1IN4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxHFrkkmgPdUN5Q7nB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyPa6MKDkrCBr30LvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
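A raw batch response like the one above can be parsed and indexed by comment ID to recover the coded dimensions for any one comment. A minimal sketch in Python, using two records from the response (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` are taken from the response itself; the variable names are illustrative):

```python
import json

# Two records copied from the raw LLM response above; the full array has ten.
raw_response = """
[
  {"id": "ytc_UgwrnJ6m11bip-14br14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwHN5t8C_EteVstzRd4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]
"""

# Index records by comment ID for constant-time lookup.
coded = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the comment shown in the Coding Result table above.
record = coded["ytc_UgwHN5t8C_EteVstzRd4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer fear
```

The lookup returns the same values the Coding Result table displays for that comment (responsibility `developer`, reasoning `deontological`, policy `unclear`, emotion `fear`).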