Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugz4072hd…: After grilling all humans with his cosmic supernatural debates, Alex is here bea…
- ytc_Ugz5sO8zI…: We can spend our time enjoying the fruits of the AI. We can all do everything on…
- ytc_UgxDVk7UC…: A.I. to me feels like one of those weakened evil "gods" trapped in the underworl…
- ytc_UgwRsaQkO…: I am not getting in a plane with a f****** robot pilot. You got me f***** up…
- ytc_UgyoUMupd…: I’ve never been able to draw much better than a doodle, but I’ve been seeing so …
- ytc_UgzQMvS7x…: How can be more worse when also who made a nuke is more intelligent than AI. Wh…
- ytc_UgyQq4_h7…: This is stupid if your gonna put a robot vs human at least make the human double…
- ytr_Ugxxzqeey…: @kenswood2.031 And I'm sure it was on their (kStuen) radar to watch and see what…
Comment
People start looking to AI for stupid shit, the AI creators scramble to hardcode protections into the AI because the AI was never intended to replace other areas outside what the creators intended. AI starts losing its efficacy in its original intent and purpose. Then people notice how fuckin weird its acting. Mass hysteria increases as the hype dies down. Everyone backing AI gets the shaft from the reactionary movement. Then the AI bubble bursts, and we have a much harder time using AI in the appropriate and useful places because we had that shit in places it never needed or deserved to be(like fucking notepad!?).
I hate it here.
youtube · AI Harm Incident · 2026-01-19T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwpUHuxS1IQKE4zZhR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzD8RjRBd043gyoVnd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyPX5Dm6ZA4nQi2W0h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxLVtSMsYYA9YFLFI94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx13ZTsOPkSAuBDxMl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw1otzRlzo2j6FQ_tl4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxga2udH9DKm4ZRiAR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzq43qO8pjCegB_QC54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx18hBhJh1qL4380dJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_Cttx-RczeFCnz3d4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
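The raw response above is a JSON array of per-comment codings, one object per comment ID, with one value for each of the four dimensions in the coding table. A minimal sketch of how such a response could be parsed and sanity-checked is below; the allowed category sets are inferred only from the values visible on this page (the real codebook may define more), and the `validate_codings` helper is hypothetical, not part of any pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the codings shown above
# (assumption: the full codebook may contain additional categories).
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page use the ytc_ (comment) / ytr_ (reply) prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and drawn from its allowed set.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgwpUHuxS1IQKE4zZhR4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1
```

Records with an unknown category value or a malformed ID are silently dropped here; a production coder would more likely log them for re-prompting.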