Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "People freaking out about AI image generation is more entertaining then the imag…" (ytc_Ugx1Jbek_…)
- "I want share a quick observation, i work in the Warehouse Logistic - Storage. 7 …" (ytc_UgzbuZMCs…)
- "I'm much more concerned by alignment issues than a sentient AI. Sentience can be…" (ytc_UgzQ31Qm0…)
- "Now parents gotta watch out for the AI predators. Pandora's box has officially …" (ytc_UgzJGMB-N…)
- "The actual thing we did not realize yet. Is that LLMs are terrible at many thing…" (ytc_UgwGg1Ms9…)
- "Yah, I think this needs to be addressed. I think 1 of 2 things has to happen. 1 …" (ytc_UgxdcJJBz…)
- "While i agree with your postition on AI , i think what you are implicitly saying…" (ytc_Ugwf4trPs…)
- "Dan Brown's 2017 novel, Origin, fictonalizes the invention of a 2-story "superco…" (ytc_UgxyC05p0…)
Comment
the craziest part of this is that I'd 100 fucking percent blackmail a dude to stop him from killing me, but that's still being considered a bad thing here
alright all (not)jokes aside though, AI is a crappy replacement for humanity, we should be making tools to allow for automation and make every human filthy rich with minimal effort, but instead the mega rich are creating substitutions for humans because they'd rather have a lower class of people to suppress bellow them rather than just go with the flow and lift everyone up to a level still far bellow them where nobody would have to struggle for survival, which would still make them just as damn rich as replacing people entirely, if not more so
youtube · AI Harm Incident · 2025-11-09T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgwtE8v0OLbTd8DHZl54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzv83AzMi9Yxc_5Z-F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwXfmAMiyeZJ6d11Bx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxA73NBfFzwRyRxuq14AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxjGgnaXbybhhRM8dB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztlIosD1oxwDFE1X94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxAjdViIsU1QbgKhKR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwfMqzTl5mLYK7VIvV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyb50IccIOGRLqh4s94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwAHlGjyaXT1m1j7H94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}]
```
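The raw response above is a JSON array with one object per comment: the comment ID plus the four coding dimensions from the result table. A minimal sketch of turning such a response into per-comment codes might look like the following. Note the allowed value sets are inferred only from the values visible in this sample and may be incomplete; `parse_codes` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Dimension vocabularies observed in this sample (assumed, possibly incomplete).
DIMENSIONS = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw model response into {comment_id: codes}.

    Rows missing an "id" or carrying a value outside the observed
    vocabularies are dropped rather than guessed at.
    """
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        if all(codes[dim] in allowed for dim, allowed in DIMENSIONS.items()):
            out[cid] = codes
    return out

raw = ('[{"id":"ytc_UgwfMqzTl5mLYK7VIvV4AaABAg","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_UgwfMqzTl5mLYK7VIvV4AaABAg"]["policy"])  # → regulate
```

Keeping the lookup keyed by comment ID is what lets a viewer like this one map a clicked sample straight to its coded row.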