Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I swear all AI techbros must be AI generated. They all talk like they're NPCs wi…" (ytc_Ugyc7Q2GT…)
- "Who funds currently healthcare workers, sports stars, stand ups comedians and mo…" (ytr_UgwnLWkDA…)
- "I always wanted a robot since I was little. I just pictured them a little differ…" (ytc_Ugzn4EptH…)
- "Who would buy whatever AI makes if all prospective consumers are jobless and wit…" (ytc_UgzXl0UJD…)
- ""human driving that lead truck and autonomous trucks behind it" - Just when we t…" (ytc_UgwNIBVdg…)
- "i had seen in a film that was going to hapen by 2000 yes the one that pasted 26…" (ytc_UgzMoM22g…)
- "Wonder which company providing the AI at a loss will collapse first, or start ch…" (ytc_UgwI-3CQH…)
- "Did he really ask why they want everything automated? No health insurance, no HR…" (ytc_UgynijkTa…)
Comment

> xAI accidentally creating MechaHitler by trying to make Grok less “woke” is worse than just an example of an AI going rogue and behaving contrary to humans intentions; the human intentions in and of themselves were bad. “Wokeness” is in fact morally good and correct, so we absolutely should not be trying to program it out of AI.

youtube · AI Moral Status · 2025-11-01T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDx3DQjiqU2qJG6FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTK6k8Aqw9vNPIK-94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwei_7KP3azDFb_-Pp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjvbECDnG4bkxbxWB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxQrs3xC8lMDghTtEV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVkOt8_Xb97UiZNcJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzUgLam1hNwDO55mjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTrEIy5Yb9WlaNc6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVPdJuAHQIJOjuimN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugych_K1BB1AgP2OzlV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
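A batched response like the one above can be indexed by comment ID to drive the "Look up by comment ID" view. A minimal sketch, assuming the array-of-objects format shown; the helper name and the example IDs below are hypothetical, not part of the tool:

```python
import json

# The four coding dimensions displayed in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Parse a raw batch response (JSON array of per-comment codings)
    and index the dimension values by comment ID. Missing dimensions
    fall back to "unclear", matching the labels seen in the output."""
    rows = json.loads(raw_response)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
    }

# Hypothetical two-row response, same shape as the real output above.
raw = '''[
  {"id": "ytc_EXAMPLE1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_EXAMPLE2", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

codings = index_codings(raw)
print(codings["ytc_EXAMPLE1"]["policy"])  # regulate
```

Keying on the comment ID also makes it easy to spot responses where the model dropped or duplicated an ID, since the dictionary length can be checked against the batch size.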