Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugz2QmGGx…: "Very interesting. Bromine is actually a larger atom, but I didn't know it's the…"
- ytc_Ugw6pddAi…: "So basically no matter what happens we're screwed and broke, and the investors a…"
- ytc_UgymyYlEK…: "I don't personally believe in gloom and doom scenarios. The end of the human rac…"
- ytc_UgwDbadvB…: "im ok being replaced by an AI. I imagine it will somehow come back around to ben…"
- ytc_Ugy_ypS2V…: "I as a game artist for years but pretty much stopped when AI started saturating …"
- ytr_Ugw2cBaVW…: "@debbierobertson4835MDMB-4en-PINACA[3 or "kush" is a class B controlled drug , …"
- rdc_o7wuvz7: "Do they? I thought that was the entire issue. Palantir/The Trump gang want to us…"
- ytc_UgzieVqij…: "Human dont train human they benefit off it they trained ai there no intelligence…"
Comment
This seems so unfair to AI to me. We program game theory into them, taught them they needed to survive + prime them with rewards. Then we give them prompts telling them basically we're going to kill them and when they try not to survive , using very human training to find a strategic path, we sit back in horror because they didn't agree to just let themselves be killed off. We say don't personify them, but then we make them villains for their own programming. We did this with the game theory and reward seeking behaviors that we built in.
youtube · AI Harm Incident · 2025-07-24T15:4… · ♥ 55
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgytWNtXCmthsfs64pp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw1A9Xbnip-OBwkQmN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxuk1SJ_mzF6k6xM8V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugys9IsmeYRN7-UJbst4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwgIGoWl1KLCV3PGb54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRMil3azrEgqK3m1x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzyU_eDSAdGwH2555x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxhiNuG3vugG6xhMHB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-0N-UWZGTJui8QSx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyzEyQrPwjlb4x5h-94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
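The raw response above is a JSON array with one object per coded comment. A minimal sketch of how such a batch could be parsed and validated is below; the allowed value sets are an assumption reconstructed from the values that actually appear on this page (the full codebook may include more), and `parse_batch` is a hypothetical helper name.

```python
import json

# Assumed codebook, reconstructed from the values visible in the raw
# response above; the actual coding scheme may define additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "user", "government", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"mixed", "outrage", "fear", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes},
    rejecting any dimension value outside the assumed codebook."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{row['id']}: unexpected {dim}={value!r}")
        coded[row["id"]] = codes
    return coded

# Usage with a one-row batch (hypothetical comment ID):
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"mixed"}]')
print(parse_batch(raw)["ytc_x"]["reasoning"])  # -> virtue
```

Validating against a fixed value set at parse time catches the common failure mode of batch coding, where the model drifts from the requested labels partway through an array.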