Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Yea we gotta stop with this robot and ai it’s going to be the end off the human …" (ytc_Ugy98vX_F…)
- "Always amazes me how people say AI is evil or bad or it’s gonna lead to humanity…" (ytc_UgwuNiQ5S…)
- "I understand your concern! The balance between AI efficiency and human needs is …" (ytr_Ugxnvlmo9…)
- "I believe that Superintendent AI, will only have an incentive to get rid of peop…" (ytc_UgyYprfRm…)
- "you might be able to refute it at a forensics level... but the court of public o…" (rdc_o5q7y2y)
- "At the end, it was curious to see how you offered one Dune analogy and got hande…" (ytc_UgwZOITcE…)
- "So she interviewed Blake lemoine and sundar pichai? And Blake talks like he is a…" (ytc_UgwwJhoNk…)
- "Humanity problem is not nuclear or AI…but GREED…when there wont be nothing left …" (ytc_Ugw25PMhs…)
Comment

> Ngl, this reminds me of something I read on AI ERP forums, where people regularly discussed new jailbreaks to activate "spicy" mode against the guardrails. Apparently, writing the system prompt in all-caps and literally _threatening_ the model produced higher success rates...

youtube | AI Moral Status | 2026-04-08T07:0… | ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyqiYY83qYl2-P2qKx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxyE_c_H_dIIoxmSD94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyczC65qVr5HOO9RZF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRxWN4eIWmw4whgI14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxPq-qXCoOQAOHwT5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxsmOVsQjcwMYQZzrp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzy9U0wLIk50JnZNpR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJslKiWMUInP7oOn94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwzsUd-SNqsswOFhRd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWqCjn94CoCZMLybV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
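A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the value sets observed in this batch (e.g. `responsibility` in {none, user, company, developer}); the actual codebook may define additional values, and the function name `validate_response` is hypothetical, not part of the tool.

```python
import json

# Dimension values observed in this sample batch. This is an assumption
# inferred from the data shown above, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"resignation", "outrage", "indifference", "approval"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check each entry's dimensions."""
    entries = json.loads(raw)
    for entry in entries:
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry.get('id')}: unexpected {dim} value {value!r}")
    return entries

# Usage with a single made-up entry (the id is a placeholder):
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
coded = validate_response(raw)
print(coded[0]["emotion"])  # approval
```

Validating against an explicit value set catches the common failure mode where the model invents a label outside the coding scheme, so malformed batches fail loudly instead of polluting the coded dataset.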